Using Filebeat 1.0.0; I have also tested the nightly 1.2 release, but it doesn't seem to fix the issue.
I have an issue where, once a log file grows past 2 GB, it no longer seems to be processed. The other files Filebeat is watching continue to be processed.
I'm sending the data to Logstash; the mapping is one we had created within Logstash. Events seem to be going through OK now. I think the actual problem was with Elasticsearch, though at this stage I have no idea what or why.
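For reference, the setup is roughly the following (a minimal sketch in Filebeat 1.x config style; the log path and Logstash host are placeholders, not the actual values from my environment):

```yaml
filebeat:
  prospectors:
    -
      # Hypothetical path; one of the files matched here grows past 2 GB.
      paths:
        - /var/log/myapp/*.log
      input_type: log

output:
  logstash:
    # Placeholder host; Logstash is listening with the beats input plugin.
    hosts: ["logstash.example.com:5044"]
```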