Restore snapshots on a different environment

Hey All,
I want to take a snapshot and restore on a different environment.
What are the steps I should follow?
I already created the snapshots on the original environment.
I have multiple files created on each node (I have several elastic nodes).

  • I have indices directory created on each node.
  • On the master node, I have other files and folders created.

What's next?
Should I copy the files to the new environment? I am not sure what should be done next.
What are the critical files? Or are all of them critical?

Thank you!

If you used the snapshot API you just need to use the restore API. You don't need to manually do anything with files.
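As a sketch, assuming a registered repository named my_repo and a snapshot named snapshot_1 (both hypothetical names), the restore call in Kibana Dev Tools looks like:

```
POST /_snapshot/my_repo/snapshot_1/_restore
{
  "indices": "*",
  "include_global_state": false
}
```

You can follow progress with GET /_snapshot/my_repo/snapshot_1/_status.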

Hi @dadoonet
Thanks for the reply.
This is a completely different environment.
So I cannot restore the snapshot as the new environment is not connected to the same repository or even the same network.


Taking and restoring snapshots at the file system level is not supported and will not work in newer versions. You should use the snapshot API to create the snapshot into some shared storage mounted on all the nodes. This snapshot can then be transferred to a different environment, mounted onto all the nodes in that cluster, and restored.
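For reference, registering a shared-file-system repository and taking a snapshot through the API might look like this (repository name, snapshot name, and mount point are hypothetical; the location must be listed under path.repo in elasticsearch.yml on every node):

```
PUT /_snapshot/my_backup
{
  "type": "fs",
  "settings": {
    "location": "/mnt/shared/backups"
  }
}

PUT /_snapshot/my_backup/snapshot_1?wait_for_completion=true
```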

Hi @Christian_Dahlqvist
Thanks for the reply.
That is exactly what I am trying to do.

I am asking about the technical way to do so.
Should I merge directories with the same IDs?
Or simply copy and restore?
Anything else I need to know?


You create the snapshot onto shared storage using the snapshot API. Once it has completed you can transfer it as it is. Do not make any changes to it. Then mount it again as shared storage onto all nodes in the destination cluster, configure the repository and restore.
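On the destination cluster, those steps could be sketched as follows (same hypothetical names as above; the location is the path where the transferred snapshot is mounted):

```
PUT /_snapshot/my_backup
{
  "type": "fs",
  "settings": {
    "location": "/mnt/shared/backups"
  }
}

POST /_snapshot/my_backup/snapshot_1/_restore
```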

Hey @Christian_Dahlqvist
One point is
In the original environment I have 5 nodes, while in the destination I have 1 node.
I have duplicate directory names across the nodes.
One directory on node A contains only a dat file, while its duplicate on node B also contains the indices folder and files.
If I move everything into one repository, I will get conflicts.


It sounds like you are looking directly at the file system of the nodes and not creating a snapshot using the API I linked to. The API will handle this and take a cluster-wide snapshot which can be restored to a cluster of a different size without problems. Restoring from a direct file-system copy the way you seem to describe is not supported and will not work.

Hi @Christian_Dahlqvist
I am creating a snapshot with the API.
It is created successfully and can be restored as well in the same environment.
Restoring with API or via Kibana UI is working just fine.

I would like, however, to restore the snapshot on another cluster. In another network.

How can I do that? If at all?

Thank you

You need to move the full snapshot that you took to the new environment. If the destination is a single-node cluster you can mount the storage directly, but if it has several nodes you need shared storage. Then configure a repository that points to this location and use the restore API. You naturally need to set up the repo path in your config, just as you did on the cluster where you created the snapshot.
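The repo path setting mentioned here goes in elasticsearch.yml on every node that should see the repository (the path itself is hypothetical):

```
path.repo: ["/mnt/shared/backups"]
```

Elasticsearch refuses to register a filesystem repository whose location is not listed here, and the setting requires a node restart to take effect.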

I have configured a repository on each node.
Let's call it repo_folder.
So I have 5 × repo_folder, one per node. Not one shared storage.
That means that I cannot just copy all repo_folders into the new env as is.
So, I guess it is a bad configuration to start with?!

How should I configure the repository in such a case? Or must I have shared storage?


Yes, shared storage is required. The API should not work without it so I am not sure how you have managed to create a snapshot using it.

If you have internet access there is also a plugin that allows you to take snapshots to AWS S3, which could remove the need to move the snapshot around.
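If you go that route, the repository-s3 plugin is installed on each node and the repository registered with type s3 (the bucket name below is hypothetical; AWS credentials are configured separately in the Elasticsearch keystore):

```
bin/elasticsearch-plugin install repository-s3

PUT /_snapshot/s3_backup
{
  "type": "s3",
  "settings": {
    "bucket": "my-snapshot-bucket"
  }
}
```

Since both clusters would point at the same bucket, nothing needs to be copied by hand.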

Hi @Christian_Dahlqvist
Thanks for the explanations and your patience.
I am not sure how it works, but the snapshot API works just fine with my settings.
However, it creates a different issue as discussed in this thread.
I will try to create a new repository on a shared storage.

Thank you again

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.