I'm trying to post data to Elasticsearch with a curl command. It works fine IF the data is JSON, but I'm trying to load plain text - specifically log4j logs. I keep getting {"error":"ElasticsearchParseException[Failed to derive xcontent]","status":400}, which is SO unhelpful.
Hope someone can help me pinpoint the problem and steer me in the right direction.
Thanks guys for the quick replies. Unfortunately I have to use log4j logfiles, which of course are not JSON. I've read both links provided over and over, and I know the Logstash docs briefly mention plain text but give no examples. I've also tried many of the ideas from googling - they don't work. It's beginning to look as if I've hit a brick wall.
Maybe this will clarify things. I have a working .conf file, and all 3 products (Logstash, Elasticsearch and Kibana) start successfully. It's the following command that's causing the problem:
curl -s -XPOST localhost:9200/_bulk --data @catalina.2015-10-13.log
If I use a JSON file instead of the log file, it works.
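For comparison, the JSON file that works is the sample data from the Logstash site, which (as far as I can tell) is already in the bulk format - pairs of lines, an action line followed by a document, roughly like:

{ "index" : { "_index" : "logs", "_type" : "log" } }
{ "message" : "some sample text", "@timestamp" : "2015-10-13T14:22:01.123Z" }

whereas the catalina log is just plain text lines.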
The reason I'm doing this is that although Logstash runs fine, and Elasticsearch seems to have some data when I query it, in Kibana I can only see the JSON data (which is straight sample data from the Logstash site) - nothing else. I even converted part of the log to JSON and saved it as a .json file; the XPOST works and Elasticsearch acknowledges the file, but I still cannot see it in Kibana.
If you have got Logstash running successfully, then the actual indexing is meant to be done by Logstash, not by you running a curl command to upload a log file. As Mark mentioned before, Elasticsearch accepts data in JSON form only, so people usually use scripts or tools like Logstash or Fluentd to read data in its native format (logs, in your case) and transform it into JSON that reflects what they want to do with it in Elasticsearch. (As an aside, even a correctly formatted bulk file needs to be posted with curl's --data-binary rather than --data, because --data strips out the newlines that the bulk API relies on.)

With Logstash, the simplest config would just take the log file, parse each line (assuming one log record per line) and wrap it in a JSON document with a timestamp and a message field. Since you're using log4j, it won't always be a single line per record (e.g. when errors occur, the stack traces span multiple lines). The Logstash file input can be configured with the multiline codec to separate log records based on a regex pattern - see the config sketch below.
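Roughly speaking, each log record would then come out as an event like this (the field values here are purely illustrative; the file input adds host and path itself):

{ "@timestamp" : "2015-10-13T14:22:01.123Z", "message" : "2015-10-13 14:22:01,123 ERROR [main] com.example.Foo - something went wrong", "host" : "myserver", "path" : "/var/log/tomcat/catalina.2015-10-13.log" }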
Once you've tested a Logstash config, all you really need is to have Logstash running and monitoring a directory of log files, and it will take care of indexing the JSON documents into Elasticsearch.
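A minimal config along those lines might look like the sketch below - the path and the timestamp pattern are assumptions, so adjust them to where your logs live and to your log4j layout:

input {
  file {
    path => "/var/log/tomcat/catalina.*.log"
    start_position => "beginning"
    codec => multiline {
      # Assume every new record starts with a timestamp such as
      # "2015-10-13 14:22:01,123" (log4j's default %d layout); any
      # line that doesn't match (e.g. a stack trace line) is folded
      # into the previous record.
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate => true
      what => "previous"
    }
  }
}
output {
  elasticsearch {
    # On older Logstash 1.x this setting is host => "localhost"
    hosts => ["localhost:9200"]
  }
}

Start it with bin/logstash -f your-file.conf and Logstash does the posting to Elasticsearch for you - no curl needed. By default the events go into daily logstash-YYYY.MM.dd indices, which is what Kibana's default logstash-* index pattern expects.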