What is the best way to copy an Elasticsearch index to another server?

I need to copy data from an old Elasticsearch server (v1.1.1) to another server running v6.2.

I have no physical access to the old server, but it doesn't have any security (X-Pack) installed, so I can easily navigate through the data with REST. I have full root access to the destination server running Elasticsearch 6.2.

I'm really confused searching for answers; there seem to be too many ways to do it (reindex, backup, elasticdump, etc.).

What is the best way to do it?

The source server has a single index with 16 different document types mapped (so "manual" mapping would be complicated).
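Of the tools I've seen mentioned, elasticdump looks like the simplest to try, since it seems to just copy mappings and documents over HTTP. A rough sketch of what I think the invocation would look like, if I'm reading its README right (the index name is a placeholder, and I doubt 6.2 would accept the 1.1.1 mappings unchanged):

# install the npm tool (assumes Node.js on the destination server)
npm install -g elasticdump

# copy the mapping first, then the documents
elasticdump --input=http://es.production.mysite.org:9200/mydata-2018.09.01 --output=http://localhost:9200/mydata-2018.09.01 --type=mapping
elasticdump --input=http://es.production.mysite.org:9200/mydata-2018.09.01 --output=http://localhost:9200/mydata-2018.09.01 --type=data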

There have been a number of changes to types and mappings between these versions, so I suspect you will need to migrate the mappings before you migrate the data by reindexing it. Note in particular that indices created in 6.x can only contain a single mapping type, so the 16 types in your 1.1.1 index will have to be split into separate destination indices (or merged into one type with a discriminator field).
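Since your 6.2 server can reach the old one over REST, it may also be worth trying the reindex-from-remote feature that 6.2 ships with: the new node opens a scroll against the remote cluster and indexes the documents locally, and it is documented to work against much older versions. A minimal sketch, assuming the old server is reachable at es.production.mysite.org:9200; the index and type names are placeholders you'd substitute, with one _reindex call per type:

# elasticsearch.yml on the 6.2 node must whitelist the source first:
#   reindex.remote.whitelist: "es.production.mysite.org:9200"

# Inspect the old mappings, to build adjusted 6.x mappings from:
curl -s 'http://es.production.mysite.org:9200/_mapping?pretty'

# Pull one type's documents from the old cluster into a local index:
curl -s -X POST 'http://localhost:9200/_reindex' -H 'Content-Type: application/json' -d '
{
  "source": {
    "remote": { "host": "http://es.production.mysite.org:9200" },
    "index": "mydata-2018.09.01",
    "type": "mytype"
  },
  "dest": { "index": "mydata-2018.09.01-mytype" }
}'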

I'm currently investigating whether I can use Logstash on the destination server to pull data from the old Elasticsearch, using something like this in the Logstash config:

input {
  elasticsearch {
    hosts => "es.production.mysite.org"
    index => "mydata-2018.09.*"
    # query => '{ "query": { "query_string": { "query": "*" } } }'
    size => 500        # documents per scroll page, not a total limit
    scroll => "5m"     # keep-alive for each scroll context
    docinfo => true    # expose source index/type/id in [@metadata]
  }
}
output {
  elasticsearch {
    # hosts is omitted, so this writes to localhost, i.e. the new 6.2 server
    index => "copy-of-production.%{[@metadata][_index]}"
    document_type => "%{[@metadata][_type]}"
    document_id => "%{[@metadata][_id]}"
  }
}
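If I understand the elasticsearch input correctly, it shouldn't need a schedule for a one-shot copy: it runs the query once, keeps fetching scroll pages of "size" documents until the scroll is exhausted, and then the pipeline finishes on its own. So I'd expect to run it once in the foreground, something like this (the path and config file name are assumptions for a default package install):

/usr/share/logstash/bin/logstash -f copy-old-index.conf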

But I still don't understand how to pull all 12 million documents from the source Elasticsearch. Would running Logstash on a schedule, with some "smart query" pulling the data piece by piece, be the right approach?
If I use that config as-is (with the query line commented out), I somehow always end up with exactly 500 documents in the destination index. I probably don't understand the "scroll" part.
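Ending up with exactly 500 documents (the "size" setting) suggests only the first scroll page is ever fetched, which could be a compatibility problem between a current Logstash and the 1.1.1 scroll API rather than a query problem. One way to narrow it down is to drive the scroll by hand against the old server (hostname and index pattern taken from the config above; the scroll_id placeholder has to be copied out of each response):

# Open a scroll context and fetch the first 500 documents
curl -s 'http://es.production.mysite.org:9200/mydata-2018.09.*/_search?scroll=5m&size=500' -d '{ "query": { "match_all": {} } }'

# Fetch the next page; repeat until "hits" comes back empty
curl -s 'http://es.production.mysite.org:9200/_search/scroll?scroll=5m&scroll_id=<scroll_id from previous response>'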
