Persisting values between events from the same Filebeat source file

I need some help with an issue I am having. It is not necessarily a Logstash problem so much as a question about processing logic. I am using Filebeat to ship the logs to Logstash for further processing; the log lines are not multiline.

An example log file looks like this:
"XY51661R 200 SOME TEXT More Info / extra-string AAAA 00:20 all 2:55 Page 1 "
"From user : test user filenumber . . . : aaaa/bbbb/cccc 23.11.2018 1.0 "
"------------------------------------------------------------------------------------------------------------------------------------ "
"Series 1 / 2 Prog ProgM Start End Duration Status "
"------------------------------------------------------------------------------------------------------------------------------------ "
" 300 200 ABC0400C Main 2:55:20 2:55:22 0:02 Normal termination "
" 300 300 ABC0200C Secondary 2:55:22 2:55:30 0:08 Normal termination "
"End of print 8'888 "

Since the logs are shipped on a per-line basis, a single log line does not contain all of the required information. The second line carries the date of the job (23.11.2018), and I need to somehow "save" this value so it can be used across all incoming events for the current log file. I have managed to partially achieve this by tagging each line based on its grok match: the match for the second line, the one containing the date, adds a unique tag (2headline) to that event. When a normal job event is matched (" 300 300 ABC0200C Secondary 2:55:22 2:55:30 0:08 Normal termination "), the date value is then looked up with the Elasticsearch filter plugin and added as a new field on the job event. A sketch of this setup follows below.
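
For reference, here is a minimal sketch of my current filter. It is only an approximation of my real config: the field name job_date, the index pattern, and the hosts are placeholders, and the Filebeat path field may be [source] or [log][file][path] depending on the Filebeat version.

filter {
  if [message] =~ /From user/ {
    # Second header line: extract the job date (e.g. 23.11.2018) and tag the
    # event so the job lines can find it later.
    grok {
      match => { "message" => "(?<job_date>%{MONTHDAY}\.%{MONTHNUM}\.%{YEAR})" }
      add_tag => [ "2headline" ]
    }
  } else if [message] =~ /termination/ {
    # Job line: look up the previously indexed 2headline event from the same
    # source file and copy its date into this event.
    elasticsearch {
      hosts  => ["localhost:9200"]
      index  => "logstash-*"
      # Note: a path value in [source] would likely need escaping, since
      # slashes are special characters in the query-string syntax.
      query  => "tags:2headline AND source:%{[source]}"
      fields => { "job_date" => "job_date" }
    }
  }
}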

This approach has a major flaw:
The events/log lines are processed in parallel across pipeline workers, so their order is not guaranteed. I would need to make sure the file is shipped top to bottom and processed by Logstash in that same order, and additionally that the 2headline event is already indexed and searchable in Elasticsearch by the time the job events run their lookup.
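
From what I understand, Filebeat reads a given file with a single harvester, so events should leave Filebeat in order as long as it ships to a single Logstash host. On the Logstash side, the only way I know of to preserve that order is to restrict the pipeline to a single worker. A sketch of the relevant logstash.yml settings, at the cost of pipeline throughput:

pipeline.workers: 1      # a single filter worker, so events are not processed in parallel
pipeline.ordered: true   # explicit ordering guarantee, but only available from Logstash 7.7;
                         # older versions get only best-effort ordering from the single worker

Forcing a single worker feels fragile, though, and I am not sure it is the intended solution.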

How would one do this?
