Elastic Cloud Log Archiving Recommendations

We recently moved to the Elastic Stack on Elastic Cloud, but we have a very minimal setup and would like to ask for guidance and recommendations on how to go about archiving logs without doing anything unnecessarily expensive.

We have a single node that does all the Elasticsearch processing; it has 2 GB of RAM and 60 GB of disk. We're already using 40 GB and would like to know how we can archive the older logs instead of just deleting them. What would be the best way to go about this?

You have two options, in my view:

  • Enable best_compression on older indices; this will help reduce the space used
  • Or snapshot old indices into an S3 bucket and then delete them from the cluster (see the sketch after this list)
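A minimal sketch of the snapshot route, assuming a repository named my_s3_repo, a bucket named my-log-archive, and monthly log indices (all names are placeholders), and that the deployment has access to the bucket:

# Register an S3 snapshot repository (names are placeholders)
PUT _snapshot/my_s3_repo
{
  "type": "s3",
  "settings": {
    "bucket": "my-log-archive"
  }
}

# Snapshot the old monthly indices into it
PUT _snapshot/my_s3_repo/logs-2023-01?wait_for_completion=true
{
  "indices": "logs-2023.01.*",
  "include_global_state": false
}

# Once the snapshot succeeds, drop the originals to reclaim disk
DELETE /logs-2023.01.*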

Your other option is to set up a hot/warm/cold architecture and then migrate older data onto the cold nodes (a sketch follows).
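A minimal sketch of that migration, assuming Elasticsearch 7.10 or later (where data tiers exist) and a deployment that already has a cold tier; the index name is a placeholder:

# Ask the cluster to move this index to the cold tier,
# falling back to warm/hot if no cold node is available
PUT /logs-2023.01.01/_settings
{
  "index.routing.allocation.include._tier_preference": "data_cold,data_warm,data_hot"
}

In practice you would usually let an ILM policy handle this migration automatically rather than issuing it per index.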

Thanks for the reply. Where can I find an article that explains how to set up this "best_compression"?

PUT /index_name/_settings
{
  "index": {
    "codec": "best_compression"
  }
}

I noticed that I cannot execute this on an index that is already open. How is this meant to be used?

Thanks for the reply. I've now noticed the option of adding a warm/cold tier in the cloud console, and that the cost of a cold node is quite high compared to the hot node. If I don't query old data much, will that mean I pay less overall?

This can be done only when creating a new index.
Use it in an ILM policy, for example, or reindex old data into a new index with this setting; both are sketched below.
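For the ILM route, a minimal sketch, assuming a version where the forcemerge action accepts the index_codec option (the policy name and age threshold are placeholders):

PUT _ilm/policy/logs-archive-policy
{
  "policy": {
    "phases": {
      "warm": {
        "min_age": "30d",
        "actions": {
          "forcemerge": {
            "max_num_segments": 1,
            "index_codec": "best_compression"
          }
        }
      }
    }
  }
}

For the reindex route, the codec has to be set when the destination index is created (index names are placeholders); you then reindex the old index into it and delete the original:

# Create the destination index with the codec set at creation time
PUT /logs-2023.01-compressed
{
  "settings": {
    "index.codec": "best_compression"
  }
}

# Copy the old data across
POST _reindex
{
  "source": { "index": "logs-2023.01" },
  "dest": { "index": "logs-2023.01-compressed" }
}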

I don't suppose I can run a task in ILM or something on Elastic Cloud that can do the re-indexing for me while enabling the best_compression option? All I can think of is running a long-running script to do that for me, along the lines of the sketch below.
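For what it's worth, a minimal sketch of the calls such a script would loop over, assuming hypothetical index names; _reindex can run asynchronously, so the script only needs to poll the returned task until it completes:

# Start the reindex in the background; the response contains a task id
POST _reindex?wait_for_completion=false
{
  "source": { "index": "logs-2023.01" },
  "dest": { "index": "logs-2023.01-compressed" }
}

# Poll for completion, substituting the task id from the response above
GET _tasks/<task_id>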
