Value too large to output (49839 bytes) Logstash error

We are running the ELK stack, version 7.16.2, on Windows Server 2016.

We are sending data from Filebeat to Logstash through a pipeline, and the pipeline uses the grok plugin.

When we send the data, Logstash reports the following error:

Timeout executing grok '(?<parsedtime>%{MONTHNUM}/%{MONTHDAY}/%{YEAR} %{HOUR}:%{MINUTE}:%{SECOND})\s*\t%{DATA:process} \(%{DATA:processcode}\)\s*\t%{DATA:tid}\s*\t(?<area>[^\t]*)\s*\t(?<category>[^\t]*)\s*\t%{WORD:eventID}\s*\t%{WORD:level}\s*\t(?<eventmessage>.*)\t%{UUID:CorrelationID}?' against field 'message' with value 'Value too large to output (49839 bytes)! First 255 chars are:

The pipeline configuration is:

if "GO ULS" in [tags] {
grok {
match => { "message" => "(?<parsedtime>%{MONTHNUM}/%{MONTHDAY}/%{YEAR} %{HOUR}:%{MINUTE}:%{SECOND})\s*\t%{DATA:process} \(%{DATA:processcode}\)\s*\t%{DATA:tid}\s*\t(?<area>[^\t]*)\s*\t(?<category>[^\t]*)\s*\t%{WORD:eventID}
\s*\t%{WORD:level}\s*\t(?<eventmessage>.*)\t%{UUID:CorrelationID}?" }
}
date {
locale => "en"
match => ["parsedtime", "MM/dd/YYYY HH:mm:ss.SS"]
}
}

Matching a pattern in grok can be really slow (resulting in a timeout) when the pattern fails to match, because each %{DATA} forces the regex engine to backtrack through many possible match positions. If you are trying to match the entire line, add ^ at the start of the pattern to anchor it to the start of the line, so a non-matching line fails fast instead of being retried at every offset. If you can replace any of those DATA patterns with NOTSPACE, that will also speed things up.
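For example, the filter could be rewritten as below. This is a sketch, not a drop-in replacement: it assumes the process, processcode, and tid fields never contain spaces (which is what makes NOTSPACE safe for them); the field names and the rest of the pattern are unchanged from the original.

grok {
  # ^ anchors the match to the start of the line; NOTSPACE (\S+) replaces
  # DATA for fields assumed never to contain spaces, which cuts backtracking.
  match => { "message" => "^(?<parsedtime>%{MONTHNUM}/%{MONTHDAY}/%{YEAR} %{HOUR}:%{MINUTE}:%{SECOND})\s*\t%{NOTSPACE:process} \(%{NOTSPACE:processcode}\)\s*\t%{NOTSPACE:tid}\s*\t(?<area>[^\t]*)\s*\t(?<category>[^\t]*)\s*\t%{WORD:eventID}\s*\t%{WORD:level}\s*\t(?<eventmessage>.*)\t%{UUID:CorrelationID}?" }
}

With the ^ anchor, a line that does not begin with a timestamp is rejected immediately, rather than the engine retrying the whole pattern at every character position of a 49 KB message.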

Thanks a lot for your input; we have now changed it to GREEDYDATA.
