MongoDB input plugin for Logstash: missing data problem

Hi, I'm using logstash-input-mongodb to import my data from MongoDB into ELK.
But when the import finished, I found that one document was missing (I have 17781 docs in MongoDB but only got 17780 docs in ELK).

I installed the plugin the "normal" way, with the command bin\logstash-plugin install logstash-input-mongodb.
So can anyone help me with this problem?

Also, I found that when something changes in the source database in MongoDB, this plugin does not detect it and import the new data. And I didn't find any schedule setting (like the one in the jdbc input plugin) that would let me pull the data regularly. So can anyone tell me how to keep the data in sync with MongoDB using this plugin?
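To make it concrete, what I am hoping for is roughly the equivalent of this manual polling loop (just a sketch using the Ruby mongo driver; the connection string, database, and collection names are placeholders), but done by the plugin itself through a schedule setting:

    require 'mongo'

    # Placeholder connection details -- not my real setup.
    client     = Mongo::Client.new('mongodb://127.0.0.1:27017/mydb')
    collection = client[:mycollection]

    last_id = nil
    loop do
      filter = last_id.nil? ? {} : { :_id => { :$gt => last_id } }
      collection.find(filter).sort(:_id => 1).each do |doc|
        # In Logstash each of these would become an event sent to Elasticsearch.
        p doc
        last_id = doc['_id']
      end
      sleep 60  # what a schedule option would let the plugin do for me
    end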

Thank you!

@magnusbaeck @val
Hi, experts. Could you please give me some advice?

It appears from this line:

return collection.find({:_id => {:$gt => last_id_object}}).limit(batch_size)

that the plugin only fetches documents with an _id greater than the last one it has seen, relying on the ObjectId increasing with insertion time, which means it will not see modified documents.
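To illustrate, here is a minimal sketch with the Ruby mongo driver (the connection and collection names are invented): an update leaves a document's _id untouched, so it never matches the :$gt filter, whereas a freshly inserted document gets a new, larger ObjectId and is picked up.

    require 'mongo'

    client     = Mongo::Client.new('mongodb://127.0.0.1:27017/testdb')
    collection = client[:docs]

    last_id = collection.insert_one(:name => 'a').inserted_id

    # Updating the existing document does not change its _id,
    # so the plugin's query cannot see the modification.
    collection.update_one({ :_id => last_id }, { :$set => { :name => 'a-modified' } })
    puts collection.find(:_id => { :$gt => last_id }).count   # => 0

    # A new insert gets a larger ObjectId and is found on the next batch.
    collection.insert_one(:name => 'b')
    puts collection.find(:_id => { :$gt => last_id }).count   # => 1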

I see you have created an issue on the GitHub repo.

First of all, thank you for replying to me.
I've also tried adding a document manually, and the plugin still didn't react to it.

P.S. The issue on GitHub was also created by me. :wink:

Hi @guyboertje, I also saw pull request #60, which tries to fix this bug, but there seems to have been no update on it so far.

It is already clear that the missing doc is the first one, namely the one with the smallest _id.
I currently work around the problem by adding a placeholder doc to take the first place, which works but isn't very convenient.
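Roughly, the workaround looks like this: I insert a dummy doc whose _id sorts before every real one, so the dummy is the document that gets skipped instead of my real data (just a sketch with the Ruby mongo driver; the connection details and the placeholder timestamp are made up).

    require 'mongo'

    client     = Mongo::Client.new('mongodb://127.0.0.1:27017/mydb')
    collection = client[:mycollection]

    # An ObjectId built from an old timestamp sorts before every real _id,
    # so this placeholder becomes the "first" doc and is the one left behind.
    placeholder_id = BSON::ObjectId.from_time(Time.utc(2000, 1, 1))
    collection.insert_one(:_id => placeholder_id, :placeholder => true)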

Moreover, I have also tried adding a new doc, whose _id would be the largest, while Logstash was importing through the mongodb input plugin. Logstash didn't react to the new doc either. Do you know how to solve this?

Thank you!
