The same data keeps being re-added, resulting in duplicate documents, when using the jdbc input to fetch data from the MySQL server

How can I stop the same rows from being fetched again and appended to the previously indexed data, creating redundancy?

For example, when I add docs A and B, and later add C, then docs A and B get added to Elasticsearch again along with C, producing A, B, A, B, C, ...

input {
  jdbc {
    jdbc_driver_library => "/home/sachin/jdbc/mysql-connector-java-5.1.47/mysql-connector-java-5.1.47-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/pepipayment"
    jdbc_user => "root"
    jdbc_password => "Root@123"
    schedule => "* * * * *"
    statement => "SELECT * from pepi_pay"
  }
}

filter {
  mutate { convert => ["trid", "integer"] }
  mutate { convert => ["clientid", "integer"] }
  mutate { convert => ["amount", "integer"] }
  mutate { convert => ["invoice1", "integer"] }
  mutate { convert => ["status", "integer"] }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "trans_pay"
    document_type => "pay"
  }
  stdout {
    codec => rubydebug
  }
}

If your database has either a sequence column or a timestamp column that indicates the last update time, then the jdbc input can track state between runs and fetch only new or changed rows. If you search the Logstash forum for sql_last_value you should be able to find some examples.
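As a sketch of that approach, assuming trid is a monotonically increasing primary key in pepi_pay (if it isn't, substitute your own sequence or update-timestamp column), the input could track its position with sql_last_value like this:

```
input {
  jdbc {
    jdbc_driver_library => "/home/sachin/jdbc/mysql-connector-java-5.1.47/mysql-connector-java-5.1.47-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/pepipayment"
    jdbc_user => "root"
    jdbc_password => "Root@123"
    schedule => "* * * * *"
    # Remember the highest trid seen so far (persisted in last_run_metadata_path)
    use_column_value => true
    tracking_column => "trid"
    tracking_column_type => "numeric"
    # Only fetch rows newer than the last recorded value
    statement => "SELECT * FROM pepi_pay WHERE trid > :sql_last_value ORDER BY trid"
  }
}
```

Independently of that, setting document_id => "%{trid}" in the elasticsearch output makes indexing idempotent: a row fetched twice overwrites the same document instead of creating a duplicate.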
