I recently deployed the ELK stack for application performance monitoring. Now that a huge volume of logs is being created, I want to aggregate the logs to the minute and purge the raw logs. That way I could retain the aggregated data for a longer time for trend analysis.
I could write a standalone Java program to do this, but I'm looking for a standard way to achieve this aggregation instead of writing custom code.
The standard way is to keep the raw data and use Elasticsearch's aggregations (aggs) to compute the per-minute views at query time.
If you want to roll the data up into pre-aggregated documents so you can purge the raw logs, there is no standard way.
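For the query-time approach, a `date_histogram` aggregation with a one-minute interval is the usual building block. Below is a minimal sketch of such a request body in Python; the field names `@timestamp` and `response_time` are assumptions, adjust them to your own mapping (note that newer Elasticsearch versions use `fixed_interval` where older ones used `interval`):

```python
from datetime import datetime, timedelta

def build_minute_rollup_query(minutes_back=60):
    """Build a per-minute rollup request body for the last N minutes."""
    since = (datetime.utcnow() - timedelta(minutes=minutes_back)).isoformat()
    return {
        "size": 0,  # we only want the aggregation buckets, not raw hits
        "query": {"range": {"@timestamp": {"gte": since}}},
        "aggs": {
            "per_minute": {
                # one bucket per minute of log data
                "date_histogram": {
                    "field": "@timestamp",
                    "fixed_interval": "1m",
                },
                # sub-aggregation: average response time within each minute
                "aggs": {
                    "avg_response_time": {"avg": {"field": "response_time"}}
                },
            }
        },
    }

query = build_minute_rollup_query(minutes_back=30)
```

You would send this body to `_search` on your log index (e.g. via any Elasticsearch client); each returned bucket is effectively one aggregate document per minute.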
Aggregating logs often requires custom logic, since that logic can differ significantly depending on the type of logs and the analyses the aggregated documents need to support. Because of this, there is no standard, out-of-the-box way to do it. In some cases it can be done with Elasticsearch aggregations, but more complex log rollups often require fetching the documents via scan and scroll and performing the actual aggregation in a client-side application.
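The client-side variant can be sketched like this: documents are streamed in batches (e.g. from the scroll API) and rolled up to the minute in memory, producing one aggregate document per minute that you can index back before purging the raw logs. The document shape `{"@timestamp": ..., "response_time": ...}` is an assumption for illustration:

```python
from collections import defaultdict
from datetime import datetime

def rollup_to_minute(docs):
    """Roll an iterable of log documents up to per-minute aggregates."""
    counts = defaultdict(int)
    totals = defaultdict(float)
    for doc in docs:
        ts = datetime.fromisoformat(doc["@timestamp"])
        # truncate the timestamp to the minute to form the bucket key
        minute = ts.replace(second=0, microsecond=0)
        counts[minute] += 1
        totals[minute] += doc["response_time"]
    # emit one aggregate document per minute, ready to index
    return [
        {
            "minute": m.isoformat(),
            "count": counts[m],
            "avg_response_time": totals[m] / counts[m],
        }
        for m in sorted(counts)
    ]

sample = [
    {"@timestamp": "2015-06-01T10:00:05", "response_time": 120.0},
    {"@timestamp": "2015-06-01T10:00:40", "response_time": 80.0},
    {"@timestamp": "2015-06-01T10:01:10", "response_time": 50.0},
]
aggregates = rollup_to_minute(sample)
# two buckets: 10:00 (count 2, avg 100.0) and 10:01 (count 1, avg 50.0)
```

Because the rollup logic is plain application code here, you can compute whatever the histogram aggregation cannot express (percentiles across joined fields, sessionization, etc.), which is exactly why this part tends to stay custom.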