Logstash sends last event only on shutdown


I use Logstash to parse log files and send the resulting documents to Elasticsearch. It seems to me that the last log/event is not sent until a new one appears. The only way I've managed to make Logstash send the last event is shutting it down; then I can see the last log in the Elasticsearch index or in the output file. That's not a workable approach for production. Could anybody advise me whether there is a way to configure Logstash so it pushes the last event after a timeout?

I'm familiar with the file output plugin's flush_interval setting and the elasticsearch output plugin's flush_size/idle_flush_time settings. I tried playing with them, but I still have the problem. Perhaps there's some option I missed. Any ideas?

My config looks like:

input {
  file {
    path => "/*"
    start_position => "beginning"
    codec => multiline {
      pattern => "^%{REDISTIMESTAMP} "
      negate => true
      what => "previous"
    }
  }
}

filter {
}

output {
  file { path => "output.log" }
  elasticsearch {
    index => "myIndex"
  }
}
I wonder if the problem is related to the file input's multiline codec configuration.

Thanks in advance,

This sounds like the behaviour often seen with the multiline codec/filter, where Logstash cannot determine when the final event is complete: the last event stays buffered until a following line arrives to close it.

Frankly, I can't get rid of the multiline codec for now. Is there a way to influence it in terms of flush/timeout?

There is an issue on GitHub tracking a fix for this. Filebeat has recently added multiline support and allows you to specify a timeout, which should allow you to avoid this problem.
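A minimal sketch of what that Filebeat configuration could look like; the path is hypothetical, and note that Filebeat takes a plain regular expression rather than a grok pattern, so %{REDISTIMESTAMP} would need to be expanded into its regex equivalent (the pattern below is only a placeholder):

filebeat:
  prospectors:
    - paths:
        - /var/log/myapp/*.log      # hypothetical path
      multiline:
        pattern: '^\d{1,2} \w{3}'   # placeholder regex; replace with the expansion of REDISTIMESTAMP
        negate: true
        match: after                # equivalent of the Logstash codec's what => "previous"
        timeout: 5s                 # flush a pending multiline event after 5 s of inactivity

With a timeout set, the buffered last event is emitted once no continuation line arrives within the interval, instead of waiting for the next log line.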

It looks like the GitHub issue is fixed. I've just discovered the multiline codec's auto_flush_interval option in its documentation. I also tried Filebeat with multiline support and can confirm that the last event comes through. I'd simply like to make this work without Filebeat for now. I will try auto_flush_interval and let you know. Thank you!
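For reference, this is roughly how the option would fit into my input section; the 5-second value is just an example, not a recommendation:

input {
  file {
    path => "/*"
    start_position => "beginning"
    codec => multiline {
      pattern => "^%{REDISTIMESTAMP} "
      negate => true
      what => "previous"
      auto_flush_interval => 5   # emit a buffered multiline event after 5 s with no new lines
    }
  }
}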

The auto_flush_interval setting fixed my issue. Thank you once more!