Back up ELK data to a URL

Hi,
Is there any way I can take a snapshot of Elasticsearch data and push it to a URL (HTTP), like JFrog Artifactory (https://jfrog.com/artifactory)?
Thanks !

Yep - https://www.elastic.co/guide/en/elasticsearch/reference/6.2/modules-snapshots.html#_read_only_url_repository

Not really. URL repo is read only.
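
For reference, a URL repository can be registered, but only for restores. A minimal sketch, assuming a cluster on localhost and a hypothetical web server my-snapshot-host serving the snapshot files (the host must also be whitelisted via repositories.url.allowed_urls in elasticsearch.yml):

curl -XPUT 'http://localhost:9200/_snapshot/my_url_repo' -H 'Content-Type: application/json' -d '
{
  "type": "url",
  "settings": {
    "url": "http://my-snapshot-host/my_backups/"
  }
}'

Any attempt to snapshot into this repository will be rejected; it can only serve restores.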

Thanks for your reply @dadoonet

Okay, does that mean we can only read snapshots from a URL, but there is no way to push a snapshot to a URL?

I've found this https://www.elastic.co/blog/found-elasticsearch-snapshot-and-restore

curl -XPUT 'https://-eu-west-1.foundcluster.com:9243/_snapshot/myRepo' -H 'Content-Type: application/json' -d '
{
  "type": "s3",
  "settings": {
    "bucket": "myBucket",
    "region": "eu-west-1",
    "base_path": "myCluster"
  }
}'
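
If the registration succeeds, a snapshot can then be pushed into that repository with a plain PUT, e.g. (same endpoint as above; snapshot_1 is just a placeholder name):

curl -XPUT 'https://-eu-west-1.foundcluster.com:9243/_snapshot/myRepo/snapshot_1?wait_for_completion=true'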

I'm trying something similar... I want to push the snapshots to Artifactory and read them from there when needed. Please let me know. Thanks!

I don't know how Artifactory works or what APIs are available there.
But we only support HDFS, shared FS, S3, Azure, and GCS as standard platforms to snapshot to.

Anything else would have to be built as a Java plugin, IMO.

Some people snapshot to S3, then expose the S3 bucket over HTTP and use that URL as a read-only URL repository (so they can restore from the URL).
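
For example, assuming such a bucket has been registered as a read-only URL repository named url_backup, a restore is a single POST (snapshot_1 is a placeholder):

curl -XPOST 'http://localhost:9200/_snapshot/url_backup/snapshot_1/_restore'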

Maybe there is a way to sync a directory with Artifactory (like rsync?). In that case you could create a shared FS repository, snapshot to it, and then sync it to Artifactory, as sketched below.
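
A rough sketch of that idea, assuming /mnt/es_backups is listed under path.repo in elasticsearch.yml, and artifactory-host plus the target path are hypothetical:

# Register a shared FS repository
curl -XPUT 'http://localhost:9200/_snapshot/fs_backup' -H 'Content-Type: application/json' -d '
{
  "type": "fs",
  "settings": {
    "location": "/mnt/es_backups"
  }
}'

# Take a snapshot into it
curl -XPUT 'http://localhost:9200/_snapshot/fs_backup/snapshot_1?wait_for_completion=true'

# Sync the repository directory to Artifactory (or any remote host)
rsync -av /mnt/es_backups/ user@artifactory-host:/path/to/artifactory/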

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.