Pick up updates from an Elasticsearch index and send them to Kafka

Hello Team,
I need your guidance on a problem.

Requirement:
We push updates into our Elasticsearch index from our source system, and we want to pick up those updates and forward them to a Kafka topic using the Kafka output plugin.

Question: How do we pick up only the updates? Is there a way to identify the type of each update (a new record, an update to an existing record, or a delete event)?

Any guidance would be appreciated.

The solution I am looking for:
Pick up only the UPDATES, not the whole index, on every scheduled run that sends to the Kafka topic. The configuration below sends all of the index data every time it runs.

Logstash pipeline configuration:

input {
  elasticsearch {
    hosts => "xxxxxxxxxxxx01:9201"
    index => "testcaseindex"
    size => 1000
    scroll => "5m"
    docinfo => true
  }
}

output {
  kafka {
    codec => json
    topic_id => "POC.CASE.TOPIC"
    bootstrap_servers => "XXXXXXXX01.XX.XXXX.com:9092"
  }
}
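
This configuration scrolls the whole index on every run because the elasticsearch input has no built-in change tracking (nothing like the jdbc input's sql_last_value). If your documents carry an update timestamp, however, you can schedule the input and restrict the query to the most recent interval, so each run fetches only recently changed documents. Below is a minimal sketch; the last_updated field is a hypothetical name your source would need to maintain, and the 5-minute interval is an assumption:

input {
  elasticsearch {
    hosts => "xxxxxxxxxxxx01:9201"
    index => "testcaseindex"
    # cron-style schedule: run the query every 5 minutes
    schedule => "*/5 * * * *"
    # fetch only documents changed in the last 5 minutes;
    # "last_updated" is a hypothetical timestamp field the source must set on every write
    query => '{ "query": { "range": { "last_updated": { "gte": "now-5m" } } } }'
    size => 1000
    scroll => "5m"
    docinfo => true
  }
}

Keep the query window aligned with the schedule interval; a slightly wider window (e.g. now-6m) plus idempotent consumers avoids missing documents indexed right around a run boundary.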

Run with:
bin/logstash -f logstash-filter.conf
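
Since docinfo => true is already set, the document id travels with each event in metadata (under [@metadata][_id] in older versions of the elasticsearch input plugin, under [@metadata][input][elasticsearch][_id] in newer ones) and can serve as the Kafka message key, so successive updates to the same document land in the same partition and stay ordered. A sketch assuming the older metadata layout:

output {
  kafka {
    codec => json
    topic_id => "POC.CASE.TOPIC"
    bootstrap_servers => "XXXXXXXX01.XX.XXXX.com:9092"
    # key messages by Elasticsearch document id; adjust the field path
    # to match your input plugin version
    message_key => "%{[@metadata][_id]}"
  }
}

Delete events cannot be picked up this way at all: once a document is deleted it no longer matches any query. The usual workarounds are a soft-delete flag (e.g. a deleted: true field that the scheduled query can still match) or capturing changes upstream of Elasticsearch, before they reach the index.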

This is a Logstash question, so please do not open multiple threads for the same question.
