Reg. Pushing MongoDB data to Elasticsearch

Hi there,
I have been trying to integrate my existing MongoDB with
Elasticsearch for quite some time, but with no luck. The goal is to add
enhanced search and report generation (for which I am planning to use
Kibana) on existing production data stored in MongoDB clusters.
Approximately 80 GB of data is available in my databases.
I am looking for the best method by which I can push existing
data as well as real-time data into ES.
I already tried the MongoDB river plugin for ES by Richard
Louapre, but I am not satisfied with its performance. Firstly, it doesn't even
pull my entire MongoDB collection into ES. My data also has nested
objects, and the plugin doesn't support those either. Is this happening due
to incorrect configuration on my side? Or are you aware of any instances where
this plugin has worked smoothly with high volumes of data (approximately
6-7 rivers running at the same time, talking to several other MongoDB
collections with gigabytes of data)?
Any alternative plugin or wrapper that can handle this volume
of data, or the best way to go ahead with this, would be highly appreciated.

Thanks,
Priyanka

Rivers have been removed from Elasticsearch (deprecated in 1.5 and removed in 2.0), so the river plugin is a dead end.

You can try Logstash. I can see that there is a community input plugin to read from MongoDB, though I have never tried it myself.
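If you go that route, the pipeline would look roughly like the sketch below. This assumes the community `logstash-input-mongodb` plugin; the option names are taken from its README and the URIs/paths/index names are placeholders, so treat this as a starting point rather than a verified config:

```
input {
  mongodb {
    uri                 => "mongodb://localhost:27017/mydb"   # your Mongo URI
    placeholder_db_dir  => "/opt/logstash-mongodb"            # plugin state dir
    placeholder_db_name => "logstash_sqlite.db"
    collection          => "mycoll"
    batch_size          => 5000
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "mycoll"
  }
}
```

You would need one pipeline (or one input block) per collection, similar to running several rivers.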

The other solution is to write your own code which reads from MongoDB and writes to Elasticsearch using the bulk API.
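A minimal sketch of that approach in Python, assuming the `pymongo` and `elasticsearch` client libraries (the collection and index names are hypothetical):

```python
# Read documents from MongoDB and index them into Elasticsearch in bulk.
# Sketch only -- hosts, database, and index names are assumptions.

def to_bulk_action(doc, index):
    """Turn one MongoDB document into an Elasticsearch bulk action.

    Mongo's ObjectId is not JSON-serializable, so it is reused (as a
    string) for the ES document _id and dropped from the source body.
    """
    doc = dict(doc)
    doc_id = str(doc.pop("_id", ""))
    return {"_index": index, "_id": doc_id, "_source": doc}


def sync_collection(collection, es_client, index, chunk_size=1000):
    """Stream an entire collection into ES via the bulk helper.

    `collection` is a pymongo Collection and `es_client` an
    Elasticsearch client. helpers.bulk batches the generator into
    requests of `chunk_size` docs, so 80 GB never sits in memory.
    """
    from elasticsearch import helpers  # pip install elasticsearch

    actions = (to_bulk_action(d, index) for d in collection.find())
    return helpers.bulk(es_client, actions, chunk_size=chunk_size)
```

Usage would be something like `sync_collection(MongoClient(...)["mydb"]["mycoll"], Elasticsearch("http://localhost:9200"), index="mycoll")`, run once per collection for the initial load. Nested objects are not a problem here: they index as ES `object` fields by default.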

For the real-time aspect, maybe you could consider doing that from the application which generates the data? Write to both systems at the same time? Or use a message broker like Redis/Kafka/RabbitMQ in the middle?
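The broker idea above can be sketched with a toy in-memory example. `queue.Queue` stands in for Redis/Kafka/RabbitMQ and plain dicts stand in for the two stores; a real broker would additionally let the Mongo and ES consumers read the event stream independently, at their own pace:

```python
# Toy sketch of decoupled dual writes via a broker: the application
# publishes each change event once; consumers apply it to each system.
import queue

broker = queue.Queue()
mongo_store, es_store = {}, {}  # stand-ins for MongoDB and Elasticsearch


def publish(event):
    """The application writes once, to the broker only."""
    broker.put(event)


def consume_all():
    """Drain pending events, applying each to both stores.

    With a real broker these would be two separate consumers, so a slow
    or failed ES indexer never blocks the primary MongoDB write path.
    """
    while not broker.empty():
        event = broker.get()
        mongo_store[event["id"]] = event["body"]  # Mongo consumer
        es_store[event["id"]] = event["body"]     # ES indexing consumer


publish({"id": "1", "body": {"title": "hello"}})
consume_all()
```

The payoff is that the application never talks to ES directly, so indexing lag or ES downtime can't fail user-facing writes.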

My 2 cents