Logstash cannot consume Filebeat events


#1

Hello,

I have a log file with 4000 lines and I am trying to consume it with Filebeat.

Using the elasticsearch output, it works: I can retrieve all 4000 documents in my index.

But when I use the logstash output, my index only has 80 documents!
My Logstash config is very simple: just a beats input, no filter, and a very simple elasticsearch output.
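For reference, a minimal Logstash pipeline matching that description might look like this (the port and host values are assumptions for illustration, not taken from my actual setup):

```conf
input {
  beats {
    # assumed port; must match the logstash output in filebeat.yml
    port => 5044
  }
}

output {
  elasticsearch {
    # assumed local Elasticsearch instance
    hosts => ["localhost:9200"]
  }
}
```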

The only way I have found to make it work is to set spool_size to 1 in Filebeat, so that Logstash has enough time to consume the events.
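In other words, a workaround along these lines in filebeat.yml (the log path is a placeholder, not my real path):

```yaml
filebeat:
  # workaround: flush every single event instead of batching,
  # which slows Filebeat down enough for Logstash to keep up
  spool_size: 1
  prospectors:
    - paths:
        # placeholder path for illustration
        - C:\logs\mylog.log
```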

I am on a Windows platform with the latest logstash-input-beats version (0.9.6).

Any suggestions about this behaviour?


(Andrew Kroh) #2

Hi @ogauchard, see the issue below. I think it is the same problem; it is being worked on now.

https://github.com/elastic/filebeat/issues/226#issuecomment-156220617


#3

Thank you for answering so quickly!

It seems to be the same problem. I just have to wait for the fix!


(ruflin) #4

@ogauchard We found the problem and have a fix ready now. It will be part of RC2. We plan to merge it today, so it will also be part of the next nightly build. It would be great if you could try it out.


#5

Good news!

I'll test it and give you feedback, probably on Monday.


#6

Well done!

The issue is solved with the nightly build.


(ruflin) #7

@ogauchard That is good news. Thanks for testing and reporting.

