Offline Elastic Restore From GCS

I needed to shift a sizeable amount of data out of a GCS repository. Directly from the Google Cloud Console, I downloaded the Elastic snapshots to a local external hard drive.

With a fully functioning local Elastic instance, where I keep other data, I installed the GCS plugin and the keystore credentials, designated the repository, and attempted to restore.

Here is the problem:
The downloaded snapshots can only be registered as a file-share (fs) repository. When I try to designate them as a GCS repository, Elasticsearch refuses to recognize locally stored files as a GCS repository. However, when I designate a GCS repository, I can only input a cloud-based file path. This eliminates the entire point of downloading these files into an offline local environment.
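For concreteness, this is roughly how I registered the local copy (the repository name and drive path here are placeholders, not my exact values). The snapshot directory first has to be whitelisted in elasticsearch.yml:

```
path.repo: ["E:\\elastic-snapshots"]
```

and then registered as an fs repository:

```
PUT _snapshot/my_local_repo
{
  "type": "fs",
  "settings": {
    "location": "E:\\elastic-snapshots"
  }
}
```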

When I try to restore the snapshots from the local environment, the restore fails, stating that "daily snap XYZ123" from Google is missing.

Did I err in downloading the snaps directly from the Google Console to a local environment, and is there a way to restore GCS Elastic snaps in a local environment in this situation without having to redownload everything? Given the file size, this is not at all an irrelevant consideration; this is a very large data set even when compressed into snapshots.

Welcome to our community! :smiley:

Can you share a reproduction of what you are doing, e.g. step by step? Don't worry too much about commands, I just want to make sure I am following your logical steps.

Sure, thanks for the help.

  1. Downloaded Elastic snapshots taken in the Google Cloud to a local machine.

  2. Installed GCS plugin and Google creds into keystore.

  3. So to take a brief pause: snapshots now reside in a Google Cloud instance AND on a local external hard drive. The snaps on the local external hard drive were downloaded directly via the Google Cloud Console.

  4. When I try to restore the snapshots from the local files by creating an fs repository, I am notified that:

```
Unable to restore snapshot

[Local:daily-snap-2021.04.18-xyz_123/2BpjnxckRP6E7h1B65r8Ow] is missing
```

Because it is identified as an fs local path, I do not believe it recognizes that it needs to interact with the GCS plugin. I should also note that the missing "daily-snap..." is recognizable as the "xyz_123" file path, which lists "2Bpjn..." as the UUID. Not sure if there is an error there, but it is something I noticed.

  5. When I try to create a repository with the GCS plugin, it pushes me toward Google Cloud paths and credentials rather than my locally stored snapshots. When I try to verify the Google Cloud repository using the instructions here: Configure a snapshot repository using GCS | Elasticsearch Service Documentation | Elastic

I receive the following error:

"name": "ResponseError",
"meta": {
"body": {
"error": {
"root_cause": [
"type": "repository_verification_exception",
"reason": "[Google] path is not accessible on master node"
"type": "repository_verification_exception",
"reason": "[Google] path is not accessible on master node",
"caused_by": {
"type": "access_control_exception",
"reason": "access denied ("" "C:\WINDOWS\system32\config\systemprofile\AppData\Roaming\gcloud\active_config" "read")"
"status": 500

I am wondering now if I erred in downloading the data to my local environment directly from the Google Cloud Console.

Any help would be appreciated. It is a very large amount of data, so I would really like not to have to redownload it; alternatively, perhaps we can link the Google instance to the already-downloaded files so I can restore from the local copies. Appreciate it.
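In case it helps with diagnosis, here is the kind of check I ran to compare what Elasticsearch sees in the fs repository against the snapshot names in the error (my_local_repo is a placeholder name, not my exact repository name):

```
GET _snapshot/my_local_repo

GET _snapshot/my_local_repo/_all
```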

I've been studying the issue over the weekend, and I think I need to link the fs (shared file system) repository to Google Cloud in some manner. Right now the snapshots on the local file system, registered as an fs repository, are not recognized as having been snapshotted to a Google Cloud repository and consequently will not restore. Google Cloud wants me to restore the snaps in Google Cloud itself. I've looked at a couple of options for mounting a drive between the local environment and Google Cloud. Appreciate your insight.

Yes, if you are trying to read snapshots stored in a filesystem (not GCS) then you should use a repository with type fs (not gcs). I think I'd expect what you describe to work, assuming that the files on disk are a faithful copy of the contents of the bucket, but this is not a workflow that is covered by the test suite nor have I tried it myself so you are venturing well into uncharted territory.
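Something along these lines is what I'd try, though again this is untested territory. The location must point at a directory holding a faithful copy of the bucket's top-level contents (the index-N and related metadata files at the repository root), and marking the repository readonly avoids any writes to the copy. Repository name, path, and snapshot name below are placeholders:

```
PUT _snapshot/local_copy
{
  "type": "fs",
  "settings": {
    "location": "/mnt/external/gcs-bucket-copy",
    "readonly": true
  }
}

GET _snapshot/local_copy/_all

POST _snapshot/local_copy/daily-snap-2021.04.18-xyz_123/_restore
```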

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.