I have a log file with 4000 lines and I am trying to consume it with filebeat.
Using the elasticsearch output, it works: I can retrieve 4000 documents in my index.
But when I use the logstash output, my index has only 80 documents!
My logstash config is very simple: just a beats input, no filter, and a plain elasticsearch output.
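For reference, a minimal pipeline along those lines might look like this (the port, hosts, and index name are assumptions, not taken from my actual config):

```
# Hypothetical minimal config: beats input, no filter, elasticsearch output.
input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```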
The only way I have found to make it work is to set filebeat's spool_size to 1, so that logstash has enough time to consume the events.
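To be concrete, here is a sketch of the filebeat.yml workaround I mean (the log path and logstash host are placeholders, not my real values):

```yaml
# filebeat.yml -- workaround sketch: flush the spooler after every event.
filebeat:
  prospectors:
    - paths:
        - "C:/logs/mylog.log"   # placeholder path
      input_type: log
  spool_size: 1                 # default is much larger; 1 avoids the lost events

output:
  logstash:
    hosts: ["localhost:5044"]   # placeholder host
```

With the default spool_size, most of the batch never shows up in the index.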
I am on a Windows platform with the latest logstash-input-beats version (0.9.6).
@ogauchard We found the problem and have a fix ready now. It will be part of RC2. We plan to merge it today, so it will also be part of the next nightly build. It would be great if you could try it out.