Hello
Under a high event rate, I have multiple events (with a predefined order) and I want to copy a field from the first event to the subsequent events using the Logstash elasticsearch filter.
Is there a way to guarantee that the first event has been indexed before its field is copied to the subsequent events?
I ask because I have tried this and noticed that the subsequent events are not always updated with this field (some are updated and some are not).
I have also tried the aggregate filter, but the event rate is not always high: the subsequent events may arrive hours after the first one, and this timing is not predictable. So the aggregate filter does not seem like the best approach here, and the elasticsearch filter looks better.
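For reference, this is a minimal sketch of the kind of lookup I mean, assuming a hypothetical correlation field `transaction_id` and a copied field `first_event_field` (both field names, the index pattern, and the `event_seq` marker are illustrative, not my real mapping):

```
filter {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "events-*"
    # Look up the already-indexed first event of the same transaction
    # (field names here are hypothetical)
    query => "transaction_id:%{[transaction_id]} AND event_seq:1"
    # Copy the field from the matched first event into the current event
    fields => { "first_event_field" => "first_event_field" }
  }
}
```

When the first event has not yet been indexed (or is not yet searchable) at the time a subsequent event reaches this filter, the query returns nothing and the field is missing, which matches the intermittent behavior I described above.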
Thanks