Hello
I use Logstash to parse log files and send the resulting documents to Elasticsearch. It seems that the last log event is not sent until a new one appears. The only way I have managed to make Logstash send the last event is to shut it down; then I can see the last log in the Elasticsearch index or in the output file. That is not a workable approach for production. Could anybody advise whether there is a way to configure Logstash so that it pushes the last event after a timeout?
I'm familiar with the file output plugin's flush_interval setting and the elasticsearch output plugin's flush_size/idle_flush_time settings. I tried playing with them, but I still have the problem. Perhaps there are some options I missed. Any ideas?
My config looks like:
input {
  file {
    path => "/*"
    codec => multiline {
      pattern => "^%{REDISTIMESTAMP} "
      negate => true
      what => previous
    }
    start_position => "beginning"
  }
}
filter {
  ...
}
output {
  file { path => "output.log" }
  elasticsearch {
    index => "myIndex"
  }
}
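For what it's worth, recent versions of the multiline codec have an auto_flush_interval option that flushes a buffered event after a given number of seconds with no new lines, which looks like it addresses exactly this situation. A sketch of how it might be added to the input above (assuming your logstash-codec-multiline version supports the option; the 5-second value is just an example):

```
input {
  file {
    path => "/*"
    codec => multiline {
      pattern => "^%{REDISTIMESTAMP} "
      negate => true
      what => previous
      # Flush the pending multiline event if no new line
      # arrives within 5 seconds (option available only in
      # newer versions of the multiline codec).
      auto_flush_interval => 5
    }
    start_position => "beginning"
  }
}
```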
I wonder if the problem is related to the file input's multiline codec configuration.
Thanks in advance,
Vasiliy