_grokparsefailure when processing a log file

While processing a large log file with Logstash, we are getting _grokparsefailure and _dateparsefailure tags. We would like to know which line in the log file is causing the failures so that we can investigate further with the Grok Debugger. A line number, or the log message itself, would help.

Currently, we do this in the output section:

if "_grokparsefailure" in [tags] {
  file {
    path => "./error.txt"
  }
}

Is there a way to identify the line # or the offending message itself?

Thanks in advance,

Indrajit Mitra

If you are sending them to elasticsearch then just do a search for documents with those tags. Otherwise send the event to another output only if it has the tag, just as you show.
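As a sketch of the second suggestion, a conditional file output along these lines would capture the raw text of each failing event. The path is illustrative, and the line codec is one way to write just the original message rather than the full JSON event:

```conf
output {
  if "_grokparsefailure" in [tags] {
    file {
      # hypothetical path; adjust to your environment
      path => "/var/log/logstash/grok_failures.txt"
      # write only the raw input line, one message per line
      codec => line { format => "%{message}" }
    }
  }
}
```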


Thanks for your quick response. Based on the first suggestion, we sent those events to Elasticsearch (irrespective of failure). There is only one line that is causing the failure (please see below). However, from that we could not identify which log entry in the file is causing the problem. How can we get that information, please?
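For reference, the tagged documents can be pulled back with a search like the following in Kibana Dev Tools; the index pattern `myindex-*` is a placeholder for whatever your pipeline writes to:

```conf
GET myindex-*/_search
{
  "query": {
    "match": { "tags": "_grokparsefailure" }
  }
}
```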

Doesn't your document have the [message] or [event][original] fields?

We don't see those fields. Please see the image from Kibana posted above.

This is how we are sending events to Elasticsearch in the output section:

output {
  elasticsearch {
    hosts => "myhost:7201"
    index => "%{[type]}-%{logTimeStamp}"
    document_id => "%{g_t_id}-%{t_id}"
    doc_as_upsert => true
    action => "update"
    retry_on_conflict => 5
    user => "elastic"
    password => "xxxxx"
  }
}

Please ignore the previous message. The message field was being dropped through remove_field. We will try un-removing that and re-run our test. Thanks!
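One way to keep the raw text only for failing events is to make the remove_field conditional on a successful parse. A minimal sketch, where the grok pattern is a placeholder for the real one:

```conf
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  # drop the raw message only when grok succeeded, so failing
  # events still carry [message] for debugging in Kibana
  if "_grokparsefailure" not in [tags] {
    mutate { remove_field => ["message"] }
  }
}
```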

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.