Hello everyone!
Filebeat 7.6.1 -> Logstash 7.6.1
Filebeat reads log files, handles multiline processing, and sends the events to Logstash (all OK).
But at the moment of file rotation, Filebeat cuts a message into two parts: the first part ends up in the rotated file and the second in the new file. I could not work around this problem.
To solve it, I want to use the aggregate filter in Logstash.
But I don't know much about Ruby and have never used this filter.
I searched the Internet and found an example: https://stackoverflow.com/questions/51421183/filebeat-logstash-multiline-syslog-parsing
Using patterns, I can identify the first message and the second one. I used the file path as the task identifier.
I cannot use the Logstash multiline codec because there are many sources, and running with a single worker is also not acceptable. So the plan is to match the first and second messages with patterns, collect them into an array, and then join them back into a single string.
```
if [message] =~ /pattern_first_message/ {
  aggregate {
    task_id => "%{file_path}"
    code => "map['message'] ||= []; map['message'].unshift(event.get('message'));"
    push_map_as_event_on_timeout => true
    timeout => 10
    timeout_tags => "_aggregate"
    timeout_code => "event.set('message', map['message'].join(' '))"
  }
} else if [message] =~ /pattern_second_message/ {
  aggregate {
    task_id => "%{file_path}"
    code => "map['message'] ||= []; map['message'].push(event.get('message'));"
  }
}
```
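For context, outside of Logstash the two `code` snippets boil down to plain Ruby array operations. Here is a standalone sketch of what I intend the map to do (the strings are just placeholders for the real log lines):

```ruby
# Stand-in for the aggregate filter's per-task map.
map = {}

# First half of the rotated message (pattern_first_message) is unshifted
# so it always ends up at the front of the array:
map['message'] ||= []
map['message'].unshift('first half of line')

# Second half (pattern_second_message) is pushed to the back:
map['message'] ||= []
map['message'].push('second half of line')

# On timeout, the parts should be joined back into one message:
full_message = map['message'].join(' ')
# => "first half of line second half of line"
```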
When Logstash processes this filter, I get an error:
```
Aggregate exception occurred {:error=>#<NameError: undefined local variable or method `map' for
```
Could you help fix my filter?