Duplicate entries when using jdbc pipeline

Hi,

I am getting duplicate entries when I use my jdbc pipeline below:
input {
  jdbc {
    jdbc_driver_library => "/opt/jdbcjars/mariadb-java-client-2.3.0.jar"
    jdbc_driver_class => "org.mariadb.jdbc.Driver"
    jdbc_connection_string => "jdbc:mariadb://testhost:3306/otrs"
    jdbc_validate_connection => true
    jdbc_user => "logstash-user"
    jdbc_password => "logstash-password"
    schedule => "*/5 * * * *"
    statement => "SELECT * FROM ticket_info WHERE change_time > :sql_last_value ORDER BY ticketid ASC"
    use_column_value => true
    tracking_column => "change_time"
    tracking_column_type => "timestamp"
    clean_run => true
    tags => ["jdbc","otrsdb"]
    last_run_metadata_path => "/opt/jdbc-test/.logstash_jdbc_last_run"
    id => "jdbc"
    type => "jdbc"
  }
}

filter {
}

output {
  if [type] == "jdbc" {
    elasticsearch {
      hosts => ["https://elastichost1:1223","https://elastichost2:1223","https://elastichost3:1223"]
      index => "logstash-%{+YYYY.MM.dd}"
    }
  }
}

I am getting 3 entries in Elasticsearch for each database row. I tried adding document_id => "%{[@metadata][_id]}" to the elasticsearch output, but it didn't work, and this is what I got:

{
  "_index": "logstash-2019.03.07",
  "_type": "doc",
  "_id": "%{[@metadata][_id]}",
  "_version": 1302,
  "_score": null,

Have you run Logstash with this configuration more than once?

Yes. I am using centralised pipeline management.

I believe you should only run with clean_run => true enabled once.
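With clean_run => true, the plugin resets sql_last_value whenever the pipeline starts, so each run imports the whole table again. Dropping it lets the last change_time persist in last_run_metadata_path between runs. A sketch of your input with that change — untested, and note I have also switched the ORDER BY to the tracking column, because sql_last_value is taken from the last row returned:

input {
  jdbc {
    jdbc_driver_library => "/opt/jdbcjars/mariadb-java-client-2.3.0.jar"
    jdbc_driver_class => "org.mariadb.jdbc.Driver"
    jdbc_connection_string => "jdbc:mariadb://testhost:3306/otrs"
    jdbc_validate_connection => true
    jdbc_user => "logstash-user"
    jdbc_password => "logstash-password"
    schedule => "*/5 * * * *"
    # Order by the tracking column so the persisted sql_last_value
    # comes from the newest row actually processed.
    statement => "SELECT * FROM ticket_info WHERE change_time > :sql_last_value ORDER BY change_time ASC"
    use_column_value => true
    tracking_column => "change_time"
    tracking_column_type => "timestamp"
    # clean_run removed: the last change_time now persists in
    # last_run_metadata_path across runs and restarts.
    last_run_metadata_path => "/opt/jdbc-test/.logstash_jdbc_last_run"
    tags => ["jdbc","otrsdb"]
    id => "jdbc"
    type => "jdbc"
  }
}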

I tried with that option, but I am still getting 3 entries for every single record read from my DB.

I thought the document_id => "%{[@metadata][_id]}" option would fix it, but it's not working either.

I am using Elastic 6.3.1
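One more thing worth trying: if ticketid is unique per ticket, you can point document_id straight at that column instead of at [@metadata][_id]. Then even when several Logstash nodes run the pipeline through centralised pipeline management, they all write the same _id and simply overwrite each other instead of creating duplicates. A minimal sketch of the output, assuming ticketid is the primary key:

output {
  if [type] == "jdbc" {
    elasticsearch {
      hosts => ["https://elastichost1:1223","https://elastichost2:1223","https://elastichost3:1223"]
      index => "logstash-%{+YYYY.MM.dd}"
      # The row's primary key becomes the document _id, so repeated
      # indexing of the same row updates it rather than duplicating it.
      document_id => "%{ticketid}"
    }
  }
}

Keep in mind this only deduplicates within a single daily index; a row re-read on a later day would land in that day's logstash-* index as a new document.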
