Hi All,
I'm using Logstash 1.5.4-9 with the multiline filter to parse logs and send them to Elasticsearch 1.7.4. My configuration looks something like this:
input {
  file {
    type => "abc"
    tags => [ "abc" ]
    path => [ "somefilepath" ]
    codec => plain { charset => "ISO-8859-1" }
  }
}

filter {
  if [type] == "abc" {
    # join continuation lines (anything not starting with the timestamp) onto the previous event
    multiline {
      patterns_dir => "somedir"
      pattern => "^%{SERVERTIMESTAMP} "
      negate => true
      what => "previous"
    }
    # collapse newlines and tabs into single spaces
    mutate {
      gsub => [
        "message", "\n", " ",
        "message", "\t", " "
      ]
    }
    grok {
      break_on_match => false
      patterns_dir => "somedir"
      match => [ "message", "%{LOG_FM}" ]
    }
  }
}
The output section then ships the events to Elasticsearch. Intermittently, the events that arrive in ES are incomplete. The grok pattern defined for these logs throws no error when the full log entry comes through; the _grokparsefailure tag only shows up when just half of the entry is sent.
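In case it's relevant, the output section is nothing special; it's essentially just a plain elasticsearch output along these lines (the host and index names here are placeholders, not my real values):

output {
  elasticsearch {
    host => "somehost"
    index => "someindex"
  }
}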
I've also observed that the incomplete events are the ones that contain a stack trace, which makes the log entry very long.
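I've also been wondering whether moving the multiline handling into the codec on the file input would behave any differently for these long events, something along these lines (the max_lines value is just a guess on my part, not something from my current setup):

input {
  file {
    type => "abc"
    tags => [ "abc" ]
    path => [ "somefilepath" ]
    codec => multiline {
      patterns_dir => "somedir"
      pattern => "^%{SERVERTIMESTAMP} "
      negate => true
      what => "previous"
      charset => "ISO-8859-1"
      max_lines => 1000
    }
  }
}

Would that be expected to make a difference here?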
Can someone please suggest what could be causing the incomplete events with this configuration?
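Would a stripped-down pipeline like the one below be a sensible way to check whether the events are already being cut before the output stage? It keeps the same input and multiline filter but prints the assembled events with stdout/rubydebug instead of sending them to ES (the path is a placeholder):

input {
  file {
    type => "abc"
    path => [ "somefilepath" ]
    codec => plain { charset => "ISO-8859-1" }
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  if [type] == "abc" {
    multiline {
      patterns_dir => "somedir"
      pattern => "^%{SERVERTIMESTAMP} "
      negate => true
      what => "previous"
    }
  }
}

output {
  # print each assembled event so I can see exactly where it gets cut
  stdout { codec => rubydebug }
}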
Thanks in advance!!