Duplicating Events using JDBC

Hello there, I'm using the jdbc input plugin to populate an index, but every time I stop Logstash and run this config again, the events are duplicated in the index.

This is my script:

input {
  jdbc {
    jdbc_connection_string => "jdbc:oracle:thin:@//********:*****/******"
    jdbc_user => "USER"
    jdbc_password => "PASSWORD"
    jdbc_validate_connection => true
    jdbc_driver_library => "C:\sqldeveloper\jdbc\lib\ojdbc8.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    statement => "SELECT * FROM SONAE_090_CONTRATOS"
    use_column_value => true
    tracking_column => "id"
    schedule => "0 11-14 * * *"
  }
}

filter {
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => "localhost:9200"
    index => "sonae"
  }
}

What can I do so that the documents in the index are updated instead of duplicated?

Use a fingerprint filter to derive a stable ID from each row and set it as the document_id option on the elasticsearch output. Repeated runs will then overwrite the same documents instead of creating duplicates.
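A minimal sketch of that approach, assuming the id column uniquely identifies each row (adjust source to whatever columns make a row unique in your table):

filter {
  fingerprint {
    # Hash the unique key column(s) into a stable ID;
    # stored under @metadata so it is not indexed as a field.
    source => ["id"]
    target => "[@metadata][fingerprint]"
    method => "SHA256"
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "sonae"
    # Same row -> same fingerprint -> same _id, so re-runs
    # update existing documents instead of adding new ones.
    document_id => "%{[@metadata][fingerprint]}"
  }
}

If a single column is already a unique key, you could also skip the filter and set document_id => "%{id}" directly on the elasticsearch output.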


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.