Elasticsearch Document Deletion Issue with Logstash

Hello everyone,

I’m new to the ELK stack and need help with an issue regarding document deletion in Elasticsearch.

I have a source table with 10 records that I want to mirror in Elasticsearch. I'm using Logstash to pull these records every 59 minutes (every 2 minutes in the test config below). However, during the day the number of records can change (e.g., if someone is deleted). When Logstash runs and pulls the updated data (say it now returns only 9 records), I want Elasticsearch to reflect this change so that the index contains exactly 9 documents.
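
For example, after a run I can see the stale documents in the count (using the index name and credentials from my config below; -k matches the disabled certificate verification):

curl -k -u elastic:xxxxxxxxxx "https://localhost:9200/person_index/_count"
# => {"count":10, ...} even though the source table now has only 9 rows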

My goal is to always maintain the 'current state' of the records from the source table in Elasticsearch. However, I can only perform read queries on the source database, meaning I can't modify it with triggers, soft deletes, or any similar approaches.
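
Since reads are unrestricted, I could in principle pull the whole table on every run instead of tracking created_date. A minimal sketch of the changed input (same driver and connection settings as in my config below):

input {
  jdbc {
    # ... same driver/connection settings as in the full config below ...
    schedule => "*/2 * * * *"
    # Full snapshot instead of an incremental pull via :sql_last_value
    statement => "SELECT * FROM person"
  }
}

That re-indexes every existing row on each run (updates stay idempotent thanks to document_id), but documents for deleted rows would still remain in the index.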

Could anyone guide me on how to achieve this using Logstash and Elasticsearch? Any suggestions or best practices would be greatly appreciated!
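
One idea I was considering: have Logstash write each run into a fresh index and then atomically switch an alias to it, so applications always query the latest complete snapshot. A sketch of the alias switch (hypothetical index and alias names):

curl -k -u elastic:xxxxxxxxxx -X POST "https://localhost:9200/_aliases" \
  -H "Content-Type: application/json" -d '
{
  "actions": [
    { "remove": { "index": "person_index_old", "alias": "person_current" } },
    { "add":    { "index": "person_index_new", "alias": "person_current" } }
  ]
}'

The old index would then need to be deleted separately. Is this a reasonable pattern, or is there a more idiomatic way?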

The Logstash configuration below works for updated and newly added records:

input {
  jdbc {
    # Reset sql_last_value on pipeline start so the first run pulls everything
    clean_run => true
    jdbc_driver_library => "D:/METSO/Elastic Kibana - version 2/logstash-8.15.1/logstash-conf/mysql-connector-j-9.0.0.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/test"
    jdbc_user => "root"
    jdbc_password => "root"
    # Cron schedule: every 2 minutes (test setting)
    schedule => "*/2 * * * *"
    # Incremental pull: only rows newer than the last seen created_date
    use_column_value => true
    tracking_column => "created_date"
    tracking_column_type => "timestamp"
    statement => "SELECT * FROM person WHERE created_date > :sql_last_value"
  }
}

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    index => "person_index"
    user => "elastic"
    password => "xxxxxxxxxx"
    # Self-signed dev certificate, so verification is disabled
    ssl => true
    ssl_certificate_verification => false
    # Use the table's primary key as the document _id so re-imported rows update in place
    document_id => "%{id}"
  }

  stdout {
    codec => rubydebug
  }
}
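
Another idea I looked at: stamp every event with the time of its Logstash run, then delete the documents the latest run didn't touch. This would only work together with a full-table pull, since unchanged rows also need their timestamp refreshed. A sketch (hypothetical field name last_seen):

filter {
  mutate {
    # Copy the event timestamp into a regular, searchable field
    add_field => { "last_seen" => "%{@timestamp}" }
  }
}

and then, outside Logstash (e.g. from a scheduled job):

# Delete documents whose last_seen predates the latest run,
# e.g. anything older than the 2-minute schedule interval:
curl -k -u elastic:xxxxxxxxxx -X POST "https://localhost:9200/person_index/_delete_by_query" \
  -H "Content-Type: application/json" -d '
{
  "query": { "range": { "last_seen": { "lt": "now-2m" } } }
}'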

Thank you!