Snapshot Recovery on a Separate ECE Setup

I have an ECE 2.3 setup which has gone down due to a damaged ZooKeeper. Because of this, I set up a new ECE installation and created a couple of new ES cluster deployments on it, also on version 2.3.

How do I restore data from the previous deployments to the new ES clusters on the new ECE?

On the Snapshot recovery UI, I am only able to select from deployments created on the new ECE and not from the previous one. The snapshots are stored on Minio.

Our support isn't ideal here. To clone a cluster from its snapshots you have to use the API and bring the cluster up with two key fields under `transient` in the plan: `plan_configuration.extended_maintenance` set to `true` (to prevent e.g. Kibana from adding indices during the restore process) and `restore_snapshot`. The reference docs describe the format, but in short, `repository_config.raw_settings` and `restore_payload.raw_settings` give you direct access to the ES APIs for repository creation and restore respectively.
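As a rough illustration, the `transient` section of the plan would look something like the sketch below. The field structure follows the `restore_snapshot` reference docs mentioned above, but the repository name, snapshot name, bucket, and endpoint values are all placeholders you would need to replace with your own Minio details:

```json
{
  "transient": {
    "plan_configuration": {
      "extended_maintenance": true
    },
    "restore_snapshot": {
      "repository_name": "my-minio-repo",
      "snapshot_name": "snapshot-to-restore",
      "repository_config": {
        "raw_settings": {
          "type": "s3",
          "settings": {
            "bucket": "my-snapshot-bucket",
            "endpoint": "minio.example.internal:9000"
          }
        }
      },
      "restore_payload": {
        "raw_settings": {
          "indices": "*"
        }
      }
    }
  }
}
```

The `raw_settings` bodies are passed through to the ES snapshot repository and restore APIs, so anything those APIs accept (e.g. `protocol` or path-style access settings for Minio, or index renaming options on restore) should be valid there too.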

Alternatively, you can bring up the cluster from the UI, disable shard routing on all nodes, close all indices, and then use the console API to perform the same steps manually.
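The manual route boils down to a handful of ES API calls from the console. This is only a sketch; the repository name, bucket, endpoint, and snapshot name are placeholders:

```
# Stop shard allocation so nothing moves while you work
PUT _cluster/settings
{"transient": {"cluster.routing.allocation.enable": "none"}}

# Close all indices
POST _all/_close

# Register the Minio (S3-compatible) repository
PUT _snapshot/my-minio-repo
{
  "type": "s3",
  "settings": {
    "bucket": "my-snapshot-bucket",
    "endpoint": "minio.example.internal:9000"
  }
}

# Restore from the snapshot
POST _snapshot/my-minio-repo/snapshot-to-restore/_restore
{"indices": "*"}

# Re-enable allocation once the restore completes
PUT _cluster/settings
{"transient": {"cluster.routing.allocation.enable": "all"}}
```

Note that a restore will refuse to overwrite an open index with the same name, which is why closing the indices first matters.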
