Handling duplicates from SQL dumps/JSON and API responses in Logstash

Use the scheduler option (it takes standard cron syntax) so that the query is re-run and the data is refreshed every minute, as shown below.

Your input section looks fine. Did you include an elasticsearch block in the output section so the results are actually written to Elasticsearch?

Try this,

input {
  jdbc {
    jdbc_driver_library => "xxxx\oracle-10g\ojdbc14.jar"
    jdbc_driver_class => "oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@localhost:1521:DATABASE"
    jdbc_user => "ROMAINROM"
    jdbc_password => "ROMAINROM"
    # Oracle does not support TOP; limit the result set with ROWNUM instead
    statement => "SELECT * FROM TABLE WHERE ROWNUM <= 10"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
    # cron expression: run the query every minute
    schedule => "*/1 * * * *"
  }
}

output {
  elasticsearch {
    codec => json
    hosts => ["localhost:9200"]
    index => "index9"
  }
  stdout { codec => rubydebug }
}
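
On the duplicates mentioned in the title: because the schedule re-runs the query every minute, the same rows will be indexed again as new documents with auto-generated IDs unless you give Elasticsearch a stable document ID. A common way to avoid this is to set document_id in the elasticsearch output to the table's primary key. The sketch below assumes your SELECT returns a key column called "id"; substitute whatever unique column your table actually has:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "index9"
    # "id" is a placeholder for your table's primary key column;
    # re-runs then overwrite the same document instead of creating a duplicate
    document_id => "%{id}"
  }
  stdout { codec => rubydebug }
}

With a fixed document_id, each scheduled run simply updates the existing documents rather than accumulating copies.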