Offloading old data to cheaper storage

We are currently using Elasticsearch to aggregate logs with Filebeat. We want to offload data that is older than 60 days and store it in AWS S3. The data does not need to be searchable, but we may need to retrieve it 6 months or a year from now. Once retrieved, the data can be loaded into another cluster or the current one, whichever is easier. Can someone point me to a blog or documentation? Any advice would be appreciated.


We are working towards a solution here called searchable snapshots.

In the meantime, you can take snapshots (i.e. backups) of your indices, store them in S3, and then manually restore them when you need them.
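As a rough sketch, that means registering an S3 snapshot repository and then taking a snapshot into it. The repository name, snapshot name, and bucket below are made-up examples, and your cluster needs S3 repository support (the `repository-s3` plugin on older versions):

```shell
# Register an S3 repository (bucket and base_path are example values)
curl -X PUT "localhost:9200/_snapshot/s3_archive" \
  -H 'Content-Type: application/json' -d'
{
  "type": "s3",
  "settings": {
    "bucket": "my-log-archive",
    "base_path": "elasticsearch/snapshots"
  }
}'

# Take a snapshot (blocks until done because of wait_for_completion)
curl -X PUT "localhost:9200/_snapshot/s3_archive/snapshot_1?wait_for_completion=true"
```

By default a snapshot covers all open indices; you can narrow it with an `"indices"` field in the snapshot request body.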


Note that if you are using time-based indices (e.g. per day/week), you can snapshot just part of your cluster, i.e. only the indices older than 60 days.
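For example, with Filebeat's default daily naming (`filebeat-YYYY.MM.DD`), picking out the indices older than 60 days is a small scripting job. A hedged sketch (the helper name and index names are illustrative):

```python
from datetime import datetime, timedelta

def indices_older_than(index_names, days=60, today=None):
    """Return daily indices named like 'filebeat-YYYY.MM.DD' older than `days` days."""
    today = today or datetime.utcnow()
    cutoff = today - timedelta(days=days)
    old = []
    for name in index_names:
        try:
            # Filebeat's default daily pattern puts the date after the last '-'
            date = datetime.strptime(name.rsplit("-", 1)[-1], "%Y.%m.%d")
        except ValueError:
            continue  # skip indices that don't match the date pattern
        if date < cutoff:
            old.append(name)
    return old

# With "today" pinned to 2020-06-01, a March index is past the 60-day cutoff
print(indices_older_than(
    ["filebeat-2020.03.01", "filebeat-2020.05.20", ".kibana"],
    days=60,
    today=datetime(2020, 6, 1),
))  # → ['filebeat-2020.03.01']
```

The resulting list can be joined with commas and passed as the `"indices"` field of a snapshot request, after which the originals can be deleted from the cluster.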

Those archive snapshots can live in a separate repository from the normal full-cluster snapshots I hope you are already taking.

Then later, build a big VM with a one-node ES cluster and restore that cold snapshot to it, and search it all you like; the restore will take a while, but it's a nice & cheap solution.
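On that restore cluster, the steps above would look roughly like this: register the same S3 repository (read-only, so the archive can't be overwritten), then restore only the indices you need. Repository, snapshot, and index names continue the earlier made-up examples:

```shell
# Register the same bucket as a read-only repository on the restore cluster
curl -X PUT "localhost:9200/_snapshot/s3_archive" \
  -H 'Content-Type: application/json' -d'
{
  "type": "s3",
  "settings": {
    "bucket": "my-log-archive",
    "base_path": "elasticsearch/snapshots",
    "readonly": true
  }
}'

# Restore only the indices you need from the snapshot
curl -X POST "localhost:9200/_snapshot/s3_archive/snapshot_1/_restore" \
  -H 'Content-Type: application/json' -d'
{
  "indices": "filebeat-2020.03.*"
}'
```

Marking the repository read-only on the second cluster avoids two clusters writing to the same repository, which can corrupt it.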

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.