Logstash: read a previous event's fields from Elasticsearch in case of a high event rate

Hello

In case of a high event rate, suppose I have multiple events (with a predefined order) and I want to copy a field from the first event to the subsequent events using the Logstash elasticsearch filter.

Is there a way to guarantee that the first event has been indexed before its field is copied to the subsequent events?

I have tried this, and I have noticed that the subsequent events are not always updated with this field (some are updated and some are not).

I have also tried the aggregate filter, but the event rate is not always high and is not predictable (the subsequent events may arrive hours after the first one), so the aggregate filter does not seem like the best approach here and the elasticsearch filter looks better.
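For reference, the elasticsearch filter I am using looks roughly like this (the hosts, index, and field names below are illustrative placeholders, not my real settings):

```
filter {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my-index"
    # find the first event for the same id that carries the field
    query => "id:%{[id]} AND _exists_:first_event_field"
    # copy the field from the matched document into the current event
    fields => { "first_event_field" => "first_event_field" }
  }
}
```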

Thanks

logstash generally does not preserve event order. To retain the order you will need to set pipeline.workers to 1. Also, pipeline.ordered will need to be true or auto (the default). In future versions (8.x) auto may no longer be the default, in which case you will have to set it to true.

Also, the logstash pipeline works in batches: a group of events (by default 125) goes through a filter, is passed to the next filter, and so on until it reaches the output. So you may also need to set pipeline.batch.size to 1.
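Put together, the relevant logstash.yml settings would be something like this (a sketch; these trade throughput for ordering):

```
# logstash.yml -- ordering at the cost of throughput
pipeline.workers: 1      # single worker thread, no parallel processing of events
pipeline.ordered: true   # preserve event order within the pipeline
pipeline.batch.size: 1   # one event per batch
```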

Even then there is no guarantee that the output will have indexed an event before the next event enters the pipeline.

Thanks for the reply

The reason I want to do this is that I want to display the following in a data table in Kibana.

If I have two events in this format:

```
event1 => { "id" => "1", "shared_field" => "shared_value", "field" => "test_value1" }
event2 => { "id" => "1", "field" => "test_value2" }
```

where "shared_field" is common (shared) to all of the events having a specific "id", but it is only present in the first event for that id,

then I want to show the following in the data table:

```
id    shared_field    field
1     shared_value    test_value1    => event1
1     shared_value    test_value2    => event2
```

The only way I could think of to achieve this is to denormalize the events (copy "shared_field" into the subsequent events).

Is there a way to display the data in this format in Kibana, given this format of the events?

Try post-processing with logstash.
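For example, a separate post-processing pipeline could periodically re-read the already-indexed documents that are still missing shared_field, look up the value from the first event with the same id, and update each document in place. A rough sketch, where the hosts, index name, and queries are illustrative, and the [@metadata][_id] path assumes the elasticsearch input's default docinfo target:

```
input {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my-index"
    # fetch only the documents that still lack shared_field
    query => '{ "query": { "bool": { "must_not": { "exists": { "field": "shared_field" } } } } }'
    # expose _index and _id of each hit under [@metadata]
    docinfo => true
  }
}
filter {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my-index"
    # look up the first event for this id that carries shared_field
    query => "id:%{[id]} AND _exists_:shared_field"
    fields => { "shared_field" => "shared_field" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my-index"
    # update the original document rather than indexing a new one
    document_id => "%{[@metadata][_id]}"
    action => "update"
  }
}
```

Because the input can be re-run (or scheduled), this avoids the ordering problem entirely: by the time the post-processing pipeline runs, the first event has already been indexed.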