Let's say I have code that is writing tons of JSON documents to a log file. Consider these to be as-is documents that should land in Elasticsearch exactly as they were written to the log file. All I want to do is efficiently and reliably ship them from the log file to Elasticsearch.
Is Filebeat not built for that, given that it adds all its own metadata and stuffs my document into the "message" field as a string? Do I then have to send everything through Logstash just to use a filter that extracts the original document before it goes on to Elasticsearch? Am I missing something?
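For context, the Logstash workaround I have in mind would look roughly like this (port, index name, and the exact list of wrapper fields are just placeholders for illustration):

```conf
input {
  beats { port => 5044 }
}

filter {
  # Parse the original JSON document back out of Filebeat's "message" field
  json { source => "message" }
  # Drop the Filebeat wrapper fields so only my original document remains
  mutate { remove_field => ["message", "agent", "ecs", "input", "log"] }
}

output {
  elasticsearch { hosts => ["localhost:9200"] index => "my-logs" }
}
```

That works, but it feels like a lot of machinery just to undo the wrapping Filebeat did in the first place.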