Group/update events in Elastic

Hello,

I'm posting a question about a use case in our ELK architecture. We are setting up an infrastructure with Logstash + Elasticsearch + Kibana. The same event (same value for a given ID field, but with different values in other fields) can be ingested several times. The business requirement is to see all events with the same ID grouped into a single event.

We had thought about using an Elasticsearch transform to create another index containing, for each ID, a document with all the fields that the event has ever had. The problem is that this would require a lot of storage, which we don't have. Is there any alternative? Ideally we would have an index with a single document summarizing all the events that have come in for each ID, as if each incoming event triggered an insert-or-update.
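
For reference, that "insert or update" behavior maps onto Elasticsearch's `_update` API with `doc_as_upsert`: a new ID creates the document, and a repeated ID merges the new fields into the existing one. A minimal sketch with the Python client, assuming a target index called `events-merged` and an ID field called `event_id` (both names are made up for illustration):

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

def upsert_event(event: dict) -> None:
    """Insert the event if its ID is new, otherwise merge its fields
    into the existing document."""
    es.update(
        index="events-merged",   # assumed target index name
        id=event["event_id"],    # assumed field carrying the business ID
        doc=event,               # fields to add/overwrite on the existing doc
        doc_as_upsert=True,      # create the document if it does not exist yet
    )

# Two ingests of the same event ID end up as one merged document.
upsert_event({"event_id": "42", "status": "created", "source": "app-a"})
upsert_event({"event_id": "42", "status": "closed", "resolution": "ok"})
```

A partial update only adds or overwrites the fields present in `doc`; fields from earlier ingests are kept, which matches the "all the fields the event has ever had" requirement.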

We are quite lost. Thank you very much!

In the end we are going to build an ingestion pipeline that performs updates (upserts), so that work is moved out of the Elasticsearch cluster. As for storage, we will move the raw, un-grouped data to the cold tier.
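
As a sketch of what such an update-based ingestion step could look like when events arrive in batches (again assuming the hypothetical `events-merged` index and `event_id` field), each incoming event can be turned into a bulk `update` action with `doc_as_upsert`:

```python
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("http://localhost:9200")

def upsert_batch(events: list[dict]) -> None:
    """Send a batch of events as bulk upserts keyed on the business ID,
    so repeated IDs are merged into a single document."""
    actions = (
        {
            "_op_type": "update",       # bulk update instead of plain index
            "_index": "events-merged",  # assumed target index name
            "_id": event["event_id"],   # assumed field carrying the business ID
            "doc": event,               # fields to add/overwrite
            "doc_as_upsert": True,      # create the document on first sight of this ID
        }
        for event in events
    )
    helpers.bulk(es, actions)

upsert_batch([
    {"event_id": "42", "status": "created", "source": "app-a"},
    {"event_id": "42", "status": "closed", "resolution": "ok"},
    {"event_id": "43", "status": "created", "source": "app-b"},
])
```

If the updates are sent from Logstash instead, the elasticsearch output plugin supports a similar pattern through its `action => "update"`, `document_id`, and `doc_as_upsert` settings.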