I would really appreciate some advice on how the following could be achieved:
For each day I am creating new indices to hold the data collected by Logstash on that day, and each index contains the date in its name. Now I need a way to deal with the old indices, but rather than archiving them, I want to aggregate some of the values in them so that I still have a representation of the data, just at a coarser granularity.
For example, one of the fields contains the current SWAP usage for one of the systems, which is gathered every 10 seconds or so. Currently the old indices are as granular as the newly created ones. Therefore, I want to aggregate the SWAP values from the old indices so that instead of having a value every 10 seconds, I have one value per hour, computed as the average of the values within that hour.
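To make the intended rollup concrete, here is a minimal sketch in plain Python of the hourly averaging I have in mind (the sample data and values are made up, and in reality the samples live in Elasticsearch, not in a Python list):

```python
from collections import defaultdict
from datetime import datetime

# Fabricated samples: (timestamp, swap_used), roughly one every 10 seconds.
samples = [
    (datetime(2015, 6, 1, 0, 0, 0), 100),
    (datetime(2015, 6, 1, 0, 0, 10), 200),
    (datetime(2015, 6, 1, 1, 0, 0), 400),
]

# Bucket each sample by the hour it falls into.
buckets = defaultdict(list)
for ts, value in samples:
    hour = ts.replace(minute=0, second=0, microsecond=0)
    buckets[hour].append(value)

# One averaged value per hourly bucket -- the coarser representation I want to keep.
hourly_avg = {hour: sum(vals) / len(vals) for hour, vals in sorted(buckets.items())}

for hour, avg in hourly_avg.items():
    print(hour.isoformat(), avg)
```

This is exactly what Elasticsearch's `date_histogram` aggregation with an `avg` sub-aggregation computes server-side; the sketch is only to show the shape of the result I want to store.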
Then, as far as I can see, the result of this aggregation should be stored in a new index so that it can be searched and visualized with Kibana.
What I have so far is a simple query that uses a date histogram aggregation with a 1-hour interval over a particular time frame (let's say 12 hours) on an index that is one month old. So far so good, but now the question is how I can run the same aggregation across 30 indices instead of one, and, once I have the aggregated data, what I should do with it in order to use it in Kibana.
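For reference, my current query looks roughly like this (console style; the index name and the field names `@timestamp` and `swap_used` are from my setup, so treat them as placeholders). As far as I know, Elasticsearch accepts a comma-separated list or a wildcard pattern in the index part of the URL, so maybe something like `logstash-2015.06.*` could cover all 30 indices with the same body:

```
POST /logstash-2015.06.01/_search
{
  "size": 0,
  "query": {
    "range": {
      "@timestamp": { "gte": "2015-06-01T00:00:00", "lt": "2015-06-01T12:00:00" }
    }
  },
  "aggs": {
    "per_hour": {
      "date_histogram": { "field": "@timestamp", "interval": "1h" },
      "aggs": {
        "avg_swap": { "avg": { "field": "swap_used" } }
      }
    }
  }
}
```

What I don't know is where to go from here: the response is a list of buckets, not documents, so it isn't directly indexable for Kibana.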
I hope the use case makes sense. Any hints are greatly appreciated.