Is there any way to reduce the size of data before indexing into Elasticsearch?

As our data size is quite big, we would like to reduce the number of records. Is it possible to do some kind of aggregation before indexing into ES in Logstash? For example, is it possible to index only the record with the maximum value of a specific field over a one-minute window? Or to index only one out of every 10 records?

Yep, take a look at the aggregate filter.
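
As a rough illustration of that first idea, here is a minimal sketch of an aggregate filter that keeps only the per-host maximum of a hypothetical `metric_value` field, emitting one event per host after a minute of inactivity (the `host` task id and the field names are assumptions for illustration, not from your data):

```
filter {
  aggregate {
    # Group events by host; any field that identifies your aggregation window works here.
    task_id => "%{host}"
    # Track the running maximum for this host and drop the individual event.
    code => "
      map['max_metric'] ||= event.get('metric_value')
      map['max_metric'] = [map['max_metric'], event.get('metric_value')].max
      event.cancel()
    "
    # After 60s with no new events for this task_id, push the map as a new event.
    push_map_as_event_on_timeout => true
    timeout => 60
    timeout_task_id_field => "host"
    timeout_tags => ["aggregated"]
  }
}
```

Note that the aggregate filter requires a single pipeline worker (`pipeline.workers: 1`) so that events for the same task id are processed in order.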

Alternatively, you can ingest the data and then use rollups in Elasticsearch - Rolling up historical data | Elasticsearch Guide [7.12] | Elastic
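
For the rollup route, a job definition on 7.x could look something like the sketch below, bucketing into one-minute intervals and keeping only the max of the metric (the index names and field names are assumptions for illustration):

```
PUT _rollup/job/metrics_rollup
{
  "index_pattern": "metrics-*",
  "rollup_index": "metrics_rollup",
  "cron": "*/30 * * * * ?",
  "page_size": 1000,
  "groups": {
    "date_histogram": {
      "field": "@timestamp",
      "fixed_interval": "1m"
    }
  },
  "metrics": [
    { "field": "metric_value", "metrics": [ "max" ] }
  ]
}
```

Once the data has been rolled up you can delete the raw indices, so only the summarized one-minute documents are retained.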


Thank you for the quick reply. Let me have a look.
By the way, is it possible to accomplish the same thing in Filebeat, to reduce the network load?

Looking at the Filebeat docs, it doesn't look like it.

That is a use-case specifically addressed in the documentation.


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.