I have a log file with 4000 lines and I am trying to consume it with Filebeat.
Using the elasticsearch output, it works: I can retrieve 4000 documents in my index.
But when I use the logstash output, my index only has 80 documents!
My Logstash config is very simple: just a beats input, no filters, and a very simple elasticsearch output.
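For reference, my pipeline looks roughly like this (port, hosts, and index name shown here are placeholders, not my exact values):

```
input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "filebeat-%{+YYYY.MM.dd}"
  }
}
```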
The only way I have found to make it work is to configure Filebeat with a spool_size of 1, so that Logstash has enough time to consume the Filebeat events.
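That is, something like the following in filebeat.yml (the log path here is just an example, not my real one):

```yaml
filebeat:
  prospectors:
    - paths:
        - C:\logs\mylog.log
      input_type: log
  # Workaround: flush the spooler after every single event
  spool_size: 1
output:
  logstash:
    hosts: ["localhost:5044"]
```

With the default spool_size I lose events; with spool_size: 1 all 4000 lines arrive, but throughput is obviously terrible.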
I am on a Windows platform with the latest logstash-input-beats plugin (0.9.6).
Any suggestions about this behaviour?