I need to copy data from an old Elasticsearch server (v1.1.1) to another server running v6.2.
I have no physical access to the old server, but it doesn't have any security (X-Pack) installed, so I can easily navigate through the data with REST. I have full root access to the destination server running Elasticsearch 6.2.
I'm really confused after searching for answers; there seem to be too many ways to do it (reindex, backup, elasticdump, etc.).
What is the best way to do it?
The source server has a single index with 16 different document types mapped, so "manual" mapping would be complicated.
There have been a number of changes to types and mappings between these versions, so I suspect you will need to migrate the mappings before you migrate the data by reindexing it.
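For example (a rough sketch; host and index names are placeholders), you can dump the old mappings over REST and use them as a starting point for the 6.2 index, keeping in mind that 6.x allows only a single mapping type per index and that the old `string` field type has to become `text`/`keyword`:

```
# Dump the existing mappings from the old 1.1.1 cluster
curl -s 'http://old-es:9200/my_index/_mapping?pretty' > old_mapping.json

# Edit the file for 6.x (single type, "string" -> "text"/"keyword", etc.),
# then create the target index on the new cluster
curl -s -XPUT 'http://localhost:9200/my_index' \
  -H 'Content-Type: application/json' \
  -d @new_mapping.json
```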
I'm currently investigating whether I can use Logstash on the destination server to pull data from the old Elasticsearch,
using something like this in the Logstash config:
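Roughly like this (host and index names are placeholders; the options are from the elasticsearch input plugin):

```
input {
  elasticsearch {
    hosts   => ["http://old-es:9200"]
    index   => "my_index"
    # query => '{ "query": { "match_all": {} } }'
    size    => 500
    scroll  => "5m"
    docinfo => true
  }
}

output {
  elasticsearch {
    hosts         => ["http://localhost:9200"]
    index         => "%{[@metadata][_index]}"
    document_type => "%{[@metadata][_type]}"
    document_id   => "%{[@metadata][_id]}"
  }
}
```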
But I still don't understand how to pull all 12 million documents from the source Elasticsearch. Would it be the right approach to run Logstash on a schedule with some kind of "smart query" that pulls the data piece by piece?
Somehow, if I use that config as-is (with the query line commented out), I always end up with exactly 500 documents in the destination index. I probably don't understand the "scroll" part.
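As far as I understand, against a 1.x cluster scrolling works roughly like this (a sketch with placeholder names; `size` is the batch size per page, not a total limit), so it looks like only the first page is being fetched and the follow-up scroll requests never happen or fail silently:

```
# First request opens a scroll context and returns a _scroll_id
curl 'http://old-es:9200/my_index/_search?search_type=scan&scroll=5m&size=500' \
  -d '{ "query": { "match_all": {} } }'

# Repeat this with each new _scroll_id until no more hits come back;
# every call returns the next batch of (up to) 500 documents
curl 'http://old-es:9200/_search/scroll?scroll=5m' -d '<_scroll_id from the previous response>'
```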