It's hard to be able to help you without knowing the contents of the file you are trying to use for data. Can you paste the first few lines (not the whole file please) here? Then we can see what your problem might be.
Also, you may want to look at Logstash to help you ingest these kinds of files. It was built specifically to handle log files and can index data into Elasticsearch. It will give you a lot more control over the shape of the data that is indexed into Elasticsearch. If you have questions on Logstash after trying it out and reading the linked documentation, check out the Logstash category of this forum.
Thanks colings86.
I already tried Logstash. The problem is that the files are not being indexed with Logstash, so I moved to cURL, with which I am able to index the JSON files.
The Logstash documentation did not help me solve the problem; it doesn't have full information about elasticsearch.yml, logstash-sample.conf, and kibana.yml.
Can you please explain which fields are required in these config files for .txt input files?
Obviously this isn't exactly what you need, but it has a file input and an Elasticsearch output. If you change the path parameter to match the file pattern for your .txt files and remove the filters, you should have something to start with. Then (if you need to) you can add filters later to extract information into structured fields. The exact configuration you will need depends entirely on the structure inside your .txt files.
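As a rough sketch, a minimal file-to-Elasticsearch pipeline with no filters might look like the following. The path, hosts, and index values here are placeholders you would adapt to your own setup:

```
input {
  file {
    path => "/path/to/your/files/*.txt"   # placeholder: point this at your .txt files
    start_position => "beginning"         # read existing files from the start
    sincedb_path => "/dev/null"           # re-read files on every run (handy while testing)
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]           # assumes Elasticsearch on the default local port
    index => "txt-files"                  # hypothetical index name
  }
}
```

With no filter block, each line of the input files is indexed as-is into the `message` field; filters (e.g. grok or dissect) can be added later to split lines into structured fields.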
If you need to ask specific questions on Logstash, or you are having issues indexing into Elasticsearch using Logstash, you can ask in the Logstash category. Please remember to fully describe your problem, include any errors and your Logstash config file (paste this into a gist and link to it if your config file is long), and also describe what you have tried so far.