Hello, I have SIEM-like logs (with a unique identifier field) arriving as a log flow: two or more logs per flow, and each new log carries extra or updated info. I need to either keep only the last log, or aggregate all the logs into one containing just the new/non-repeated fields. How can I do this? I know there's the aggregate plugin, but I don't think it's meant for this scenario.
Can I merge all the logs into one using the aggregate filter? I'm not quite understanding the plugin. My pipeline already applies various filters and mutates to the data (my events arrive one after the other); can I keep those? I just want to add the extra fields that each log contributes as an update. My thought was to avoid writing a script that curls Elasticsearch, but if there's no other option I can take it.
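To show what I mean, here is a rough sketch of what I was imagining with the aggregate filter, based on the "no end event" pattern from its docs. The field name `event_id` is just a placeholder for my unique identifier field, and the timeout value is arbitrary:

```
filter {
  aggregate {
    # group all events that share the same unique identifier
    task_id => "%{event_id}"
    # merge this event's fields into the shared map;
    # later updates overwrite earlier values for the same key
    code => "map.merge!(event.to_hash)"
    # when no more updates arrive for this id, emit the merged map as one event
    push_map_as_event_on_timeout => true
    timeout_task_id_field => "event_id"
    timeout => 120
    timeout_tags => ["aggregated"]
  }
  # drop the partial events so only the merged one reaches the output
  if "aggregated" not in [tags] {
    drop {}
  }
}
```

As I understand it, this also requires running the pipeline with a single worker (`pipeline.workers: 1`) so all events for one id hit the same aggregate map, but I'm not sure this is the right approach for my case.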
EDIT: My use case is similar to the one explained here: