We are running the ELK stack, version 7.16.2, on Windows Server 2016. We send data from Filebeat to Logstash through a pipeline that uses the grok filter plugin.
When the data arrives, Logstash reports the following error:
Timeout executing grok '(?<parsedtime>%{MONTHNUM}/%{MONTHDAY}/%{YEAR} %{HOUR}:%{MINUTE}:%{SECOND})\s*\t%{DATA:process} \(%{DATA:processcode}\)\s*\t%{DATA:tid}\s*\t(?<area>[^\t]*)\s*\t(?<category>[^\t]*)\s*\t%{WORD:eventID}\s*\t%{WORD:level}\s*\t(?<eventmessage>.*)\t%{UUID:CorrelationID}?' against field 'message' with value 'Value too large to output (49839 bytes)! First 255 chars are:
The pipeline configuration is:
if "GO ULS" in [tags] {
  grok {
    match => { "message" => "(?<parsedtime>%{MONTHNUM}/%{MONTHDAY}/%{YEAR} %{HOUR}:%{MINUTE}:%{SECOND})\s*\t%{DATA:process} \(%{DATA:processcode}\)\s*\t%{DATA:tid}\s*\t(?<area>[^\t]*)\s*\t(?<category>[^\t]*)\s*\t%{WORD:eventID}\s*\t%{WORD:level}\s*\t(?<eventmessage>.*)\t%{UUID:CorrelationID}?" }
  }
  date {
    locale => "en"
    match => ["parsedtime", "MM/dd/YYYY HH:mm:ss.SS"]
  }
}
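The pattern itself does match a well-formed log line. For context, here is a rough Python equivalent of the grok expression (grok patterns compile down to regular expressions; the sub-patterns are simplified here), run against a hypothetical ULS-style line. The sample line and its field values are invented for illustration only:

```python
import re

# Hand-expanded, simplified equivalent of the grok pattern above.
# %{DATA} -> .*?   %{WORD} -> \w+   %{UUID} -> 36 hex/dash chars
pattern = re.compile(
    r"(?P<parsedtime>\d{2}/\d{2}/\d{4} \d{2}:\d{2}:\d{2}\.\d+)\s*\t"
    r"(?P<process>.*?) \((?P<processcode>.*?)\)\s*\t"
    r"(?P<tid>.*?)\s*\t"
    r"(?P<area>[^\t]*)\s*\t"
    r"(?P<category>[^\t]*)\s*\t"
    r"(?P<eventID>\w+)\s*\t"
    r"(?P<level>\w+)\s*\t"
    r"(?P<eventmessage>.*)\t"
    r"(?P<CorrelationID>[0-9a-f-]{36})?"
)

# Hypothetical SharePoint-style ULS line (tab-separated), for illustration.
line = ("01/18/2022 10:15:30.12\tw3wp.exe (0x1A2B)\t0x2C3D\t"
        "SharePoint Foundation\tGeneral\t8nca\tVerbose\t"
        "Sample event text\t9f4e8a2b-1c3d-4e5f-8a6b-7c8d9e0f1a2b")

m = pattern.search(line)
print(m.group("level"))  # prints: Verbose
```

The timeout only appears on very large messages (the failing event is ~49 KB), which suggests the unanchored pattern with several `%{DATA}` (lazy `.*?`) fields is backtracking heavily on input that does not match cleanly.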