Hi, I want to transfer data from Elasticsearch into a MongoDB instance. The data in Elasticsearch is nearly 120 GB and is continuously updated, growing by roughly 4 GB per day. How do I do it?
What I have tried:
- Logstash is very slow, and worse, when Logstash is stopped it restarts the import from the beginning instead of resuming where it left off (https://github.com/logstash-plugins/logstash-input-elasticsearch/issues/93).
- The Scroll API is good for migrating a large existing dataset, but what about the data that keeps arriving? The documentation doesn't recommend it for real-time use (a rough sketch of my scroll-based attempt is the first code block below).
- Pagination (from/size) is very slow. It might work, but in a lot of discussions it is not recommended for indices with millions of documents. It works fine in my tests on a small number of documents, but how do I make sure it will also work at scale? (My pagination attempt is the second code block below.)
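For reference, here is roughly what my scroll-based copy looks like, using the Python `elasticsearch` and `pymongo` clients. The hosts, index name, and database/collection names are placeholders for my setup:

```python
# Rough sketch: bulk copy from Elasticsearch to MongoDB via the Scroll API.
# Hosts, "my-index", "mydb", and "es_copy" are placeholders.
from elasticsearch import Elasticsearch
from elasticsearch.helpers import scan
from pymongo import MongoClient, ReplaceOne

es = Elasticsearch("http://localhost:9200")
coll = MongoClient("mongodb://localhost:27017")["mydb"]["es_copy"]

BATCH = 1000
ops = []

# helpers.scan() wraps the Scroll API and keeps the scroll context alive.
for hit in scan(es, index="my-index",
                query={"query": {"match_all": {}}}, size=BATCH):
    doc = hit["_source"]
    # Upsert keyed on the ES _id, so rerunning after a crash does not
    # create duplicates (my main complaint about Logstash above).
    ops.append(ReplaceOne({"_id": hit["_id"]}, doc, upsert=True))
    if len(ops) >= BATCH:
        coll.bulk_write(ops, ordered=False)
        ops = []

if ops:
    coll.bulk_write(ops, ordered=False)
```

This handles the initial 120 GB, but a scroll is a point-in-time snapshot, so documents indexed after the scroll starts are never seen. That is exactly the real-time gap I'm asking about.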
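And here is roughly what my pagination attempt looks like, with the same placeholder names as above:

```python
# Rough sketch: paging through the index with from/size.
from elasticsearch import Elasticsearch
from pymongo import MongoClient

es = Elasticsearch("http://localhost:9200")
coll = MongoClient("mongodb://localhost:27017")["mydb"]["es_copy"]

PAGE = 1000
offset = 0
while True:
    resp = es.search(index="my-index",
                     query={"match_all": {}},
                     sort=[{"_doc": "asc"}],
                     from_=offset, size=PAGE)
    hits = resp["hits"]["hits"]
    if not hits:
        break
    coll.insert_many([dict(h["_source"], _id=h["_id"]) for h in hits],
                     ordered=False)
    offset += PAGE  # each deeper page is slower than the last
```

Each deeper page gets slower, and `from + size` is capped by `index.max_result_window` (10,000 by default), so I don't see how this scales to the full index without raising that limit.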
Any suggestions are welcome.