I have been provided a MongoDB export .json file by a client, and they want to load it into Elasticsearch (ES) for testing. They already have a process in place to extract data from MySQL and load it into Mongo, so I expected there to be an equally easy way to take a Mongo extract and load it into ES.
I tried installing the river with the mapper attachment, but I have since learned that rivers are deprecated. I have ES 2.2.0 installed and running (with Kibana), and I have loaded data from MySQL into the node. I also have MongoDB 2.6 installed and functional, and the two backups have been imported. This is all on a CentOS 7 instance.
I have created the mappings and analyzers I need in ES, and I just need to load the MongoDB data into it. It is two indexes that are very large. Surely there is some easy, straightforward way to load the two MongoDB .json files into ES; or do I need to write custom code to do this?
I would like Logstash to be the solution here, but I cannot get access to the production server where Mongo resides to see and understand the setup I would need for Filebeat.
If someone could just point me in the right direction, it would be much appreciated.