Hi Everyone,
I am new to Elasticsearch. I have tried a lot, but I cannot find a way to send the data (collections) that is already present in MongoDB to Elasticsearch. I successfully established the connection between MongoDB and Elasticsearch using the "mongoosastic" plugin, and newly entered data is now sent from MongoDB to Elasticsearch. My question is: how can I send the collections that are already present in MongoDB?
I came to know that the .synchronize() method of mongoosastic will index all the existing data.
But it takes a long time on a huge collection with 1-2 million documents.
Can anyone suggest an alternative way to do this?
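For reference, my current setup looks roughly like this (a minimal sketch; the Book schema, host and port are placeholders):

```js
const mongoose = require('mongoose');
const mongoosastic = require('mongoosastic');

const BookSchema = new mongoose.Schema({ title: String, author: String });
BookSchema.plugin(mongoosastic, { host: 'localhost', port: 9200 });
const Book = mongoose.model('Book', BookSchema);

// New saves are indexed automatically; synchronize() walks the existing
// collection and can optionally take a filter query to sync only a subset.
let count = 0;
const stream = Book.synchronize();
stream.on('data', () => count++);
stream.on('close', () => console.log('indexed ' + count + ' documents'));
stream.on('error', (err) => console.error(err));
```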
Hello,
I am facing the same problem, and this is the approach I am following:
Using Logstash, we query the data from the database and load it into Elasticsearch.
We write queries that split the data, like:
1st query: loads rows 1 - 1,000,000 from the database
2nd query: loads rows 1,000,000 - 2,000,000 from the database
...
...
...
which helps to load the complete data without any disruption.
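In MongoDB terms that split is just a stable sort plus skip/limit, something like this (a sketch in the mongo shell; the collection name and chunk size are placeholders):

```js
// 1st query: documents 1 - 1,000,000
db.books.find().sort({ _id: 1 }).skip(0).limit(1000000)
// 2nd query: documents 1,000,000 - 2,000,000
db.books.find().sort({ _id: 1 }).skip(1000000).limit(1000000)
// ...the stable sort on _id keeps the chunks from overlapping
```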
Thank you very much.
But we are not using Logstash in our project.
So can you suggest another way to send the existing data from MongoDB to Elasticsearch?
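For example, would something along these lines be reasonable? It is only a rough sketch using a mongoose query cursor and the elasticsearch client's bulk API; the connection string, the Book model, the "books" index/type, and the batch size are all placeholders:

```js
const mongoose = require('mongoose');
const elasticsearch = require('elasticsearch');

const esClient = new elasticsearch.Client({ host: 'localhost:9200' });

async function reindexAll() {
  await mongoose.connect('mongodb://localhost/mydb');
  // Permissive schema just for reading; in the real app the existing model is reused.
  const Book = mongoose.model('Book', new mongoose.Schema({}, { strict: false }));

  const cursor = Book.find({}).lean().cursor(); // stream instead of loading everything
  let batch = [];

  for (let doc = await cursor.next(); doc != null; doc = await cursor.next()) {
    const id = String(doc._id);
    delete doc._id; // _id is metadata and must not appear in the document body
    batch.push({ index: { _index: 'books', _type: 'book', _id: id } }); // drop _type on ES 7+
    batch.push(doc);

    if (batch.length >= 2000) {          // i.e. 1000 documents per bulk call
      await esClient.bulk({ body: batch });
      batch = [];
    }
  }
  if (batch.length > 0) {
    await esClient.bulk({ body: batch });
  }
}

reindexAll().catch(console.error);
```

The idea would be to keep memory use flat by streaming the collection instead of loading it all at once, and to cut down on round trips by indexing in bulk.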
Thanks.