Multi-event processing and runtime fields

Hello there,

I'm currently trying to figure out how to raise an alert in Kibana when a field's value is identical across two different events.

Here's how my setup is configured:

Network traffic -> Suricata -> log file -> Filebeat -> Elasticsearch -> Kibana.

The log file I'm working with is a JSON file with the following format:

{"IP_header":
{
"timestamp" : "07/11/2023-13:43:42.836803",
"srcip" : "169.254.190.221", 
"dstip" : "236.6.7.8",
"dstport" : "6678",
"packet_type" : "image"
},
"IMG_header":
{
"avg_monobit" : "0.00067108869552612",
"avg_paire" : "0.014432289870456",
"hash" : "419074b7ae4ec05778d9dfc97726f6cf820190fd"}
}

I'd like to detect when the IMG_header.hash value does not change across several adjacent events (at least 2). I thought about using a boolean runtime field that would be set to 1 when IMG_header.hash is the same as in the previous event, then setting up an alert on that field, but I couldn't get it to work.
I've also thought about configuring Filebeat to forward the logs to Logstash instead of Elasticsearch and then using multi-line event processing, but I couldn't get that to work either.

What's the best way to achieve what I'm trying to do?

If this type of event processing can be done with ELK, I would also like to raise an alert when IMG_header.avg_monobit in the current event goes above or below a certain percentage of the average of that field over the last 10 events. Is that also something that can be done with the ELK stack, or does that go beyond its limits?

Thanks

Hi @Thibadu!

A runtime field wouldn't work for this use case, since you need to calculate a result across multiple documents. Runtime fields operate on the fields of a single document.

I'm not sure what the best way would be, but one option is to use a transform. A transform defines a pivot that searches your data and aggregates it in a way that exposes duplicates. You might need a scripted metric aggregation in the pivot to achieve this.
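For illustration, here's a rough sketch of what such a transform could look like in Dev Tools. Everything specific in it is an assumption you'd need to adapt: the source index pattern (filebeat-*), the destination index name (img-hash-duplicates), the @timestamp field, and the fact that IMG_header.hash is mapped as a keyword (adjust the field name if it's a multi-field like IMG_header.hash.keyword):

PUT _transform/img-hash-duplicates
{
  "source": { "index": "filebeat-*" },
  "dest": { "index": "img-hash-duplicates" },
  "frequency": "1m",
  "sync": { "time": { "field": "@timestamp", "delay": "60s" } },
  "pivot": {
    "group_by": {
      // one bucket per distinct IMG_header.hash value
      "hash": { "terms": { "field": "IMG_header.hash" } }
    },
    "aggregations": {
      // how many events share this hash, and when it was first/last seen
      "event_count": { "value_count": { "field": "IMG_header.hash" } },
      "first_seen": { "min": { "field": "@timestamp" } },
      "last_seen": { "max": { "field": "@timestamp" } }
    }
  }
}

Each document in the destination index then represents one distinct hash, with event_count telling you how many events shared it.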

Note that a transform does not alter the original data. It writes its results into a new, "summarized" destination index, which opens up more kinds of analysis.
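As a follow-up sketch, an Elasticsearch query rule in Kibana could then run against that hypothetical destination index and fire whenever a hash has been seen in two or more events:

GET img-hash-duplicates/_search
{
  "query": {
    // event_count comes from the transform above; 2 or more means the hash repeated
    "range": { "event_count": { "gte": 2 } }
  }
}

Note that the alert rule would point at the transform's destination index rather than the raw Filebeat index.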

See the Transforms documentation for more details.
