I want to migrate my existing Postgres database data to Elasticsearch. After the migration, I also want to keep Elasticsearch updated with new changes to the database.
I shared most of my thoughts there: http://david.pilato.fr/blog/2015/05/09/advanced-search-for-your-legacy-application/
Basically, I'd recommend modifying the application layer if possible and sending data to Elasticsearch in the same "transaction" as you are sending your data to the database.
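A minimal sketch of that dual-write pattern. To keep it self-contained and runnable, it uses sqlite3 in place of Postgres and a dict-backed fake client in place of the real elasticsearch-py client; the table, function, and field names are hypothetical. The point is the shape: write to the database, index into Elasticsearch, and roll the database back if indexing fails, so the two stores don't diverge.

```python
import sqlite3

# Stand-in for an Elasticsearch client, so this sketch runs without a
# cluster. The real elasticsearch-py client exposes a similar index() call.
class FakeElasticsearch:
    def __init__(self):
        self.docs = {}

    def index(self, index, id, document):
        self.docs[(index, id)] = document

def save_product(conn, es, product_id, name):
    """Write to the database and index into Elasticsearch together:
    if indexing raises, roll back the database write."""
    try:
        conn.execute(
            "INSERT INTO products (id, name) VALUES (?, ?)",
            (product_id, name),
        )
        # Reuse the database primary key as the Elasticsearch _id, so a
        # later re-index of the same row overwrites the same document.
        es.index(index="products", id=str(product_id), document={"name": name})
        conn.commit()
    except Exception:
        conn.rollback()
        raise

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT)")
es = FakeElasticsearch()
save_product(conn, es, 1, "widget")
```

With the real libraries, the same structure applies: commit the database transaction only after the Elasticsearch call succeeds.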
Can I use Logstash?
Yes, with the jdbc input plugin.
Okay, thanks. Can I have real-time sync using Logstash?
No, you can't with the jdbc input.
Can it do incremental updates? E.g. you run the jdbc input every hour and it pulls all the data from Postgres from the last hour into Elasticsearch?
I'm looking at the same problem. In my case old documents won't change, though; I just need to pull in the new data at some interval.
Yes. As long as you have something like a lastUpdated field in your database, you can select all the records WHERE lastUpdated is greater than the time of the last run. Then, if you generate each Elasticsearch document's _id from one of the fields of your database entity, Elasticsearch will overwrite the existing version with the new one.
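The approach above can be sketched as a Logstash pipeline (a sketch, not a drop-in config: the connection string, credentials, driver path, table, and column names are placeholders for your setup). `:sql_last_value` is the jdbc input's built-in tracking value, persisted between runs, and `document_id` reuses the database primary key so re-indexed rows overwrite the existing documents.

```conf
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "me"
    jdbc_driver_library => "/path/to/postgresql.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    # Run every hour; pull only rows changed since the last run.
    schedule => "0 * * * *"
    statement => "SELECT * FROM products WHERE last_updated > :sql_last_value"
    use_column_value => true
    tracking_column => "last_updated"
    tracking_column_type => "timestamp"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "products"
    # Deterministic _id: updates overwrite instead of duplicating.
    document_id => "%{id}"
  }
}
```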
Can you suggest something that will help me keep Elasticsearch in sync with my relational DB?
You should read the link David posted.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.