How to sync database changes to Elasticsearch using Logstash

I am completely new to the ELK stack. My requirement: I am creating indexes on ES using Logstash via the JDBC plugin, indexing a DB table into ES. If the DB table is modified or new data is inserted, then only the modified and inserted rows should be re-indexed in ES. For example, an Employee table contains 1000 records, and Logstash indexed it into an ES index named empRec. If, after some time, 5 of the 1000 records get modified in the DB, then only those 5 records under the empRec index should be updated immediately. Likewise, if, say, 15 employees get added, that should be reflected in the ES index as well.

How do I achieve this in the ELK stack? In brief: whenever the underlying DB changes, the corresponding ES index should be updated accordingly.

I am using the latest version of the ELK stack, i.e. 7.0.1.

You can use the upsert option of the elasticsearch output, but then you have to set document_id to a value that remains consistent (such as the primary key).
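A minimal sketch of such an output section, assuming the table's primary key column is called emp_id and the index is named emprec (adjust both to your schema):

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "emprec"
    # emp_id is assumed to be the primary key column coming from the JDBC input;
    # reusing it as the document_id means repeated runs update the same document.
    document_id => "%{emp_id}"
    action => "update"
    doc_as_upsert => true
  }
}
```

With doc_as_upsert enabled, a row that is seen for the first time is inserted, and a row that already exists under that document_id is updated in place rather than duplicated.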

Thanks for replying Anders,

My requirement is that when the DB tables are updated, the changes should be detected automatically and the corresponding ES index updated as well. In short, the ES index should be maintained as an exact copy of the DB tables.

If you want ES to know when a row of a database has changed, you can either fetch all the rows periodically and see which are different, or use a TRIGGER on the DB side to update ES when a row changes.
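A common way to make the periodic fetch cheaper is to track a last-modified column with the JDBC input's :sql_last_value parameter, so each run only pulls rows changed since the previous run. A sketch assuming MySQL, a table named employee with an updated_at timestamp column, and an emp_id primary key (all of these are assumptions, adjust to your schema):

```
input {
  jdbc {
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "pass"
    # run every 5 minutes; only rows modified since the last run are fetched
    schedule => "*/5 * * * *"
    statement => "SELECT * FROM employee WHERE updated_at > :sql_last_value"
    use_column_value => true
    tracking_column => "updated_at"
    tracking_column_type => "timestamp"
  }
}
```

Logstash persists the last seen updated_at value between runs, so this scales to large tables as long as the query is indexed on the tracking column. Note that this approach cannot detect DELETEs; rows removed from the DB stay in ES unless handled separately.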

Thanks Badger, but I have millions of records in these tables, and rows keep being added or updated.

I am looking for a more optimised solution.


If you want to mirror your changes in Elasticsearch in real time, another solution is to override your model's save method and send the data to Elasticsearch yourself.
The main drawback is that you need to change your code for each model you want to sync; this can take time, introduce bugs, etc., depending on your application.
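A sketch of this approach in Python. The Employee class and the in-memory es_index dict are hypothetical stand-ins for a real ORM model and a real Elasticsearch client; in a real application, index_document would call the Elasticsearch index/update API instead:

```python
es_index = {}  # stand-in for an Elasticsearch index, keyed by document_id


def index_document(doc_id, doc):
    """Upsert a document into the (stand-in) index."""
    es_index[doc_id] = doc


class Employee:
    def __init__(self, emp_id, name):
        self.emp_id = emp_id
        self.name = name

    def save(self):
        # 1. Persist to the database (omitted here).
        # 2. Mirror the change to Elasticsearch immediately, using the
        #    primary key as the document_id so that repeated saves
        #    update the same document rather than creating duplicates.
        index_document(self.emp_id, {"emp_id": self.emp_id, "name": self.name})


e = Employee(1, "Alice")
e.save()           # creates the document
e.name = "Alicia"
e.save()           # updates the same document in place
```

Because the save path itself pushes the change, ES is updated the moment the row changes, with no polling delay; the cost is that every write path in the application must go through save().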

Depending on your language, there may already be a library to help, like this one for Django:

Hope this helps.