Update existing record in Elasticsearch while ingesting data through Logstash pipeline

Hi, I have a requirement to ingest data from a MySQL database into Elasticsearch, and for this I'm using the Logstash JDBC input plugin.
The data volume is large, so I now need to implement incremental ingestion, which I can achieve using sql_last_value. However, I'm running into an issue meeting the requirement below while ingesting incrementally.
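For context, here is roughly what my incremental JDBC input looks like (connection details, table, and column names are placeholders for my actual setup):

```conf
input {
  jdbc {
    jdbc_driver_library => "/path/to/mysql-connector-j.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    schedule => "*/5 * * * *"
    # Track the last-seen column value so each run only picks up new or changed rows
    use_column_value => true
    tracking_column => "updated_at"
    tracking_column_type => "timestamp"
    statement => "SELECT id, name, city, updated_at FROM persons WHERE updated_at > :sql_last_value ORDER BY updated_at"
  }
}
```

This part works fine on its own; the problem is what happens on the Elasticsearch side when an already-ingested record changes.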

  1. I need to check whether a record already exists in Elasticsearch. If it does, I need to update that record's "status" field to "inactive" to mark it as inactive, and then ingest the new record with the updated fields and "status" set to "active".
    For example, this is the existing record in the Elastic index:
    ID Name City Status
    1 Raj Goa active

The person updates their city, so a new record should be ingested. But the record history must be maintained, so we need both records in the Elastic index, with only the latest one having an active status.

New records to be ingested:
ID Name City Status
2 Rahul Bihar active
1 Raj Bihar active

Final records in elastic:
ID Name City Status
1 Raj Goa inactive
2 Rahul Bihar active
1 Raj Bihar active

I need to prepare a Logstash pipeline to achieve this scenario. Kindly help me.
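One approach I've been experimenting with, though I'm not sure it's correct, is to use the elasticsearch filter plugin to look up the currently active document for the incoming ID, flag that old document inactive with a scripted update output, and then index the new row as a fresh document. Hosts, index name, and field names below are placeholders, and the exact script parameters may need adjusting for the Elasticsearch/Logstash versions in use:

```conf
filter {
  # Look up the currently active document for this person, if one exists,
  # and stash its _id in event metadata
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "persons"
    query => "id:%{id} AND status:active"
    docinfo_fields => { "_id" => "[@metadata][previous_id]" }
  }
  # Every newly ingested row is the active version
  mutate { add_field => { "status" => "active" } }
}

output {
  # If an active version already exists, mark it inactive first
  if [@metadata][previous_id] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "persons"
      document_id => "%{[@metadata][previous_id]}"
      action => "update"
      script => "ctx._source.status = 'inactive'"
      script_type => "inline"
    }
  }
  # Index the new version as a separate document; letting Elasticsearch
  # auto-generate the _id keeps both versions, preserving history
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "persons"
  }
}
```

I'm particularly unsure whether the two outputs are guaranteed to run in order for the same event, and whether the filter lookup will see documents indexed earlier in the same batch, so any guidance on a more reliable pattern would be appreciated.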

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.