Hello
I am receiving syslog messages from ClearPass servers and using Logstash to ingest them into Elasticsearch.
Some of these messages arrive split across multiple syslog packets, even though they are really one (big) message.
I am trying, without much success so far, to combine them into one message before sending them to Elasticsearch.
My grok entry is as follows:
grok {
  match => [
    "message", "<%{POSINT:syslog_pri}>%{TIMESTAMP_ISO8601:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program} %{POSINT:syslog_pid} %{NONNEGINT:num_of_msgs} %{NONNEGINT:msg_num} %{GREEDYDATA:message}",
    "message", "<%{POSINT:syslog_id}>\.\.\.%{GREEDYDATA:message}"
  ]
  overwrite => [ "message" ]
}
When a message spans multiple packets, "num_of_msgs" holds the total number of fragments (1, 2, 3, 4, etc.) and "msg_num" is the sequence number of each fragment (0, 1, 2, etc.).
The common element between all fragments is "syslog_id".
The idea is to concatenate all "message" entries that share the same "syslog_id", in the proper order, create a single event with the combined text, and submit that to Elasticsearch. Also, I do not want the partial messages in Elasticsearch, only the combined one.
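To make the intent concrete, here is the reassembly I am after expressed in plain Ruby (the variable and field names are just illustrative, not part of my actual pipeline):

```ruby
# Reassemble fragments keyed by syslog_id, ordered by msg_num.
# Each fragment mimics one parsed syslog packet.
fragments = [
  { "syslog_id" => "122", "msg_num" => 1, "message" => "world" },
  { "syslog_id" => "122", "msg_num" => 0, "message" => "hello " },
]

# Group fragments by their shared syslog_id.
buffers = Hash.new { |h, k| h[k] = [] }
fragments.each { |f| buffers[f["syslog_id"]] << f }

# For each syslog_id, sort by msg_num and join into one message.
combined = buffers.transform_values do |parts|
  parts.sort_by { |p| p["msg_num"] }.map { |p| p["message"] }.join
end

puts combined["122"]  # => "hello world"
```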
I tried this, but it is not doing what I want at all.
if [num_of_msgs] > "1" {
  aggregate {
    task_id => "%{syslog_pid}"
    code => "map['message'] ||= ''; map['message'] += ' ' + event.get('message')"
    map_action => "create_or_update"
    push_map_as_event_on_timeout => true
    timeout => 10
    timeout_tags => ["aggregated"]
  }
  if "aggregated" not in [tags] {
    drop {}
  }
}
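From reading the aggregate filter docs, I suspect the pieces I am missing are (a) a consistent task_id — my grok names the bracketed number "syslog_pri" in one pattern and "syslog_id" in the other, while the aggregate keys on "syslog_pid" — and (b) ordering the fragments by "msg_num" before joining. Something along these lines is the direction I think I need (untested sketch; it assumes the first grok pattern is renamed so both patterns capture the shared key as "syslog_id", and that every fragment carries "msg_num" — fragments matching only the second pattern would need the condition and code adjusted):

```
if [num_of_msgs] and [num_of_msgs] != "1" {
  aggregate {
    task_id => "%{syslog_id}"
    # Collect each fragment keyed by its sequence number.
    code => "map['parts'] ||= {}; map['parts'][event.get('msg_num')] = event.get('message')"
    map_action => "create_or_update"
    push_map_as_event_on_timeout => true
    timeout => 10
    timeout_task_id_field => "syslog_id"
    timeout_tags => ["aggregated"]
    # On timeout, join the collected parts in msg_num order.
    timeout_code => "event.set('message', event.get('parts').sort_by { |k, _| k.to_i }.map { |_, v| v }.join(' '))"
  }
  if "aggregated" not in [tags] {
    drop {}
  }
}
```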
I did a lot of searching on Google but have not really found anything I can follow.
Can you please help me with pointers or suggestions?
Thank you