Simple direct path from log file JSON to Elasticsearch

Let's say I have code that writes out tons of JSON documents to a log file. Consider these as-is documents that should land in Elasticsearch exactly as they were written to the log file. All I want is to efficiently and reliably ship them from the log file to Elasticsearch.

Is Filebeat not built for that, since it adds all the Filebeat metadata and just puts my document in the "message" field? Do you then have to send it off to Logstash only to use a filter to recover the original document before it goes to Elasticsearch? Am I missing something?
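For context, a default Filebeat event for one of those log lines looks roughly like this (field names as in Filebeat 1.x; the hostname and path are made up for illustration, and exact fields vary by version). Note the original document ends up as an escaped string inside "message":

```json
{
  "@timestamp": "2016-08-01T12:00:00.000Z",
  "beat": { "hostname": "app-01", "name": "app-01" },
  "input_type": "log",
  "source": "/var/log/myapp/events.json",
  "offset": 1024,
  "type": "log",
  "message": "{\"user\":\"alice\",\"action\":\"login\"}"
}
```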

We have used the json_lines codec on the Logstash stdin input.

Command for reference: logstash -f logstash-configcollector.conf < output_collector.json

logstash-configcollector.conf contains:

input {
  stdin {
    codec => json_lines
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

output_collector.json contains the JSON data, one document per line.
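That is, output_collector.json is expected to be newline-delimited JSON, one self-contained document per line. A sketch with made-up field names:

```json
{"user": "alice", "action": "login", "ts": "2016-08-01T12:00:00Z"}
{"user": "bob", "action": "logout", "ts": "2016-08-01T12:00:05Z"}
```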

Hope this is helpful.


Thank you.

I see how to do it in Logstash, but can this not be done directly with Filebeat, avoiding Logstash? Logstash seems like overkill here.
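For completeness, if you did route Filebeat through Logstash, the filter in question is the standard json filter, which parses the escaped document back out of the "message" field. A minimal sketch:

```
filter {
  json {
    # Parse the JSON string that Filebeat placed in "message".
    source => "message"
    # Drop the now-redundant raw string.
    remove_field => ["message"]
  }
}
```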

The upcoming Filebeat 5 can deserialize lines of JSON text and ship the resulting objects.
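As a sketch of what that looks like, a Filebeat 5 prospector can decode each line with its json.* options (the paths value is a made-up example; check the Filebeat 5 docs for your exact version):

```yaml
filebeat.prospectors:
- input_type: log
  # Hypothetical path; point this at your JSON log files.
  paths:
    - /var/log/myapp/*.json
  # Decode each line as JSON and put its keys at the top level
  # of the event instead of inside "message".
  json.keys_under_root: true
  # Flag events whose line fails to parse as JSON.
  json.add_error_key: true

output.elasticsearch:
  hosts: ["localhost:9200"]
```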

This topic was automatically closed after 21 days. New replies are no longer allowed.