Logstash doesn't update Elasticsearch when MongoDB records are updated or deleted

As records are updated, modified, or deleted in our MongoDB transactional database, Logstash does not fetch those changes into Elasticsearch.

Logstash: 8.5.1
Elasticsearch: 8.9.1

Is there an option? What would be the best way to use MongoDB, Logstash and Elastic?

Hi @anguri_sudhakar,

Welcome back. How are you currently pulling the records from MongoDB to Elasticsearch? Are you using the JDBC input plugin, or an alternative approach?

Thanks for the quick response :slightly_smiling_face: @carly.richmond

Here is my config file:
input {
  mongodb {
    codec => "json"
    uri => 'mongodb://mongo-admin:mongo_admin'
    placeholder_db_dir => '/data/logstash-mongodb/'
    placeholder_db_name => 'xx_elk_dev.db'
    collection => 'elk_xxx_data'
    batch_size => 50000
  }
}

filter {
  mutate {
    #remove_field => [ "_id" ]
    rename => { "_id" => "mongo_id" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch {
    hosts => ["http://xx.xx.xx.xx:9200"]
    user => "elastic"
    password => "xxxx"
    #index => "employee-%{+YYYY.MM.dd}"
    index => "elk_xxx_data"
    document_id => "%{_id}"
  }
  stdout { codec => rubydebug }
}

That is correct. The mongodb input looks for items with an _id value larger than the last _id it retrieved. The _id field is the immutable identifier of a document, so if the document is updated then the _id does not change. The mongodb input will never see it. Similarly for deletions.

You may be able to implement the tracking of updates in MongoDB using change streams, but that would be outside of Logstash.
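As a rough illustration of that idea, here is a minimal sketch of the kind of translation such a change-stream consumer would do. The function name, the default index name, and the mapping of change-stream operations to Elasticsearch bulk actions are assumptions for illustration, not anything the mongodb input plugin provides:

```python
# Hypothetical sketch: map a MongoDB change-stream event to an
# Elasticsearch bulk action, so updates and deletes can be propagated.
# Assumes the documents are indexed with the MongoDB _id as the
# Elasticsearch document id (as in the config above).

def change_event_to_es_action(event, index="elk_xxx_data"):
    """Translate one change-stream event dict into a bulk-helper action."""
    op = event.get("operationType")
    if op in ("insert", "update", "replace"):
        # For updates, a change stream opened with
        # full_document="updateLookup" includes the full updated document.
        doc_id = str(event["documentKey"]["_id"])
        doc = dict(event.get("fullDocument") or {})
        doc.pop("_id", None)  # _id is metadata, not a source field, in ES
        return {"_op_type": "index", "_index": index,
                "_id": doc_id, "_source": doc}
    if op == "delete":
        return {"_op_type": "delete", "_index": index,
                "_id": str(event["documentKey"]["_id"])}
    return None  # ignore other operations (drop, invalidate, ...)


# Usage against a live cluster would look roughly like this
# (requires the pymongo and elasticsearch packages, and MongoDB 3.6+
# running as a replica set, which change streams require):
#
# from pymongo import MongoClient
# from elasticsearch import Elasticsearch, helpers
#
# coll = MongoClient("mongodb://...")["xx_elk_dev"]["elk_xxx_data"]
# es = Elasticsearch("http://xx.xx.xx.xx:9200")
# with coll.watch(full_document="updateLookup") as stream:
#     for event in stream:
#         action = change_event_to_es_action(event)
#         if action:
#             helpers.bulk(es, [action])
```

Unlike the mongodb input's high-water-mark approach on _id, this sees every insert, update, and delete, at the cost of running and operating a separate consumer process.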


Thanks for the update @carly.richmond @Badger