Logstash 2.4 output won't execute until stopped

Hi,

I am running Logstash 2.4 on Ubuntu 14.04.

My config file is:
```
input {
  file {
    path => "/home/spark/.log"
    start_position => "beginning"
    codec => multiline {
      pattern => "(^\d+.)"
      what => "previous"
    }
  }
}

filter {
  grok {
    match => ["path", "%{GREEDYDATA}/%{GREEDYDATA:filename}.log"]
  }
}

output {
  file {
    path => "/home/spark/file"
  }
  azureblob {
    storage_account_name => ""
    storage_access_key => ""
    azure_container => "test"
  }
}
```

I run Logstash from the command line:

```shell
/opt/logstash/bin/logstash -f /etc/logstash/conf.d/first-pipeline.conf -l /var/log/logstash/logstash.log --verbose
```
Running it this way, I found that the output does not actually execute until Logstash receives SIGINT. What's the problem here?

Given you are using multiline, it's probably waiting for a CRLF so that it knows the pattern has ended.

I put a newline character at the end of the file, but it still doesn't work. BTW, I can see the positions of the files being written in the sincedb file. That means Logstash has already read the input, but for some reason it doesn't pass it to the output. I suspect a bug in the multiline codec or in Logstash itself?

Just because something doesn't work the way you expect doesn't mean there are bugs.

Have a read of the file input plugin documentation in the Logstash Reference.

I have read the file input documentation, but I can't find a solution to my problem. I set start_position to beginning, the file's last modification time does not exceed the ignore_older value, and the file ends with an LF. I believe the multiline codec should be able to pass the input through to the output, otherwise it would be meaningless. How can I make that happen?

Did you try removing the sincedb?
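For context on what removing the sincedb does: each sincedb entry records the byte offset Logstash has read up to for a watched file, keyed by inode and device numbers. The sketch below uses a temporary file as a stand-in for the real sincedb path (which by default lives under the home directory of the user running Logstash); the entry values are hypothetical.

```shell
# A sincedb line has the form: "inode major minor offset".
sincedb=$(mktemp)                              # stand-in for the real sincedb path
printf '1183500 0 51714 1000\n' > "$sincedb"   # hypothetical entry: read up to byte 1000
offset=$(awk '{print $4}' "$sincedb")
echo "recorded offset: $offset"
rm -f "$sincedb"   # deleting the sincedb makes the file input honor start_position again
```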

I found a solution to this issue: enable auto-flush by setting a small value for auto_flush_interval in the multiline codec. It will push pending events to the output periodically.
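For reference, the fix looks like this in the input section (the path and pattern are the placeholders from my config above; the 1-second interval is just an example value, pick what suits your latency needs):

```conf
input {
  file {
    path => "/home/spark/.log"
    start_position => "beginning"
    codec => multiline {
      pattern => "(^\d+.)"
      what => "previous"
      auto_flush_interval => 1   # flush a buffered multiline event after 1s of no new lines
    }
  }
}
```

Without auto_flush_interval, the codec holds the last event in its buffer forever, waiting for a line that signals the start of the next event, which is why nothing reached the output until shutdown.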


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.