Hello,
I'm using Logstash 5.1.1 to load data from an Oracle 11g database into Elasticsearch. The connection and data fetching work fine, but the problem is synchronization between the DB and Elasticsearch: every time the job runs the statement, the same data is duplicated in Elasticsearch. How can I keep the two synchronized, so that newly inserted and updated records are picked up, and deleted records are removed, dynamically while the job is running?
Here is the config file for the JDBC input:
input {
  jdbc {
    jdbc_driver_library => "path to ojdbc6.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@DB_IP:Port:DB_ServiceName"
    jdbc_user => "db_schema_name"
    jdbc_password => "db_schema_password"
    jdbc_validate_connection => true
    statement => "SELECT * FROM table_1"
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost:9200"
    index => "index_1"    # note: Elasticsearch index names must be lowercase
    document_type => "record"
    workers => 1
  }
}
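From the plugin docs it looks like `document_id` on the output plus `sql_last_value` tracking on the input might be the relevant settings. A minimal sketch of the kind of change I mean (assuming a numeric primary-key column `id`, which is hypothetical here; my connection settings stay as above) would be:

```
input {
  jdbc {
    # ... same connection settings as above ...
    schedule => "* * * * *"           # re-run the query every minute
    use_column_value => true
    tracking_column => "id"           # hypothetical primary-key column
    # only fetch rows newer than the last value seen on the previous run
    statement => "SELECT * FROM table_1 WHERE id > :sql_last_value"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "index_1"
    document_type => "record"
    document_id => "%{id}"            # re-index the same doc instead of duplicating it
  }
}
```

As I understand it, `document_id` makes updates overwrite the existing document rather than create a duplicate, and `:sql_last_value` limits each run to new rows, but the JDBC input cannot detect deletes by itself. Is a soft-delete flag column (or a periodic full reindex) the only way to handle deleted records?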
Thanks.