Hello,

We implemented a solution that pushes Microsoft SQL Server data (simple rows) through Logstash, which indexes it into Elasticsearch. The rows and columns are fetched successfully using the jdbc input plugin, and Logstash indexes them into Elasticsearch. However, every time we run the pipeline again with the same data, duplicate records with identical content are added instead of the existing documents being updated. Our configuration:
input {
  jdbc {
    jdbc_driver_library => "/etc/logstash/sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://172.17.1.102;user=sa;password=root;"
    jdbc_user => "sa"
    jdbc_password => "root"
    statement => "SELECT * FROM hlidb.dbo.inf"
    type => "pers"
  }
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://192.168.10.100:5432/postgres"
    # The user we wish to execute our statement as
    jdbc_user => "postgres"
    jdbc_validate_connection => true
    jdbc_driver_library => "/etc/logstash/postgresql-42.0.0.jar"
    # The name of the driver class for Postgresql
    jdbc_driver_class => "org.postgresql.Driver"
    # our query
    statement => "SELECT * from contacts"
    type => "contact"
  }
}
filter {
}
output {
  elasticsearch {
    hosts => "192.168.10.150"
    index => "sqltab"
    document_type => "%{type}"
    document_id => "%{uid}"
  }
  stdout { codec => rubydebug }
}
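For context, our understanding is that Elasticsearch only overwrites instead of duplicating when `document_id` resolves to a stable value per row; if the event has no `uid` field, the sprintf reference `%{uid}` is left as a literal string rather than substituted. A minimal sketch of the output we think we should be aiming for, assuming each SELECT returns a primary-key column named `id` (that column name is our assumption, not from the original tables):

```
output {
  elasticsearch {
    hosts => "192.168.10.150"
    index => "sqltab"
    # Assumes each statement returns a primary-key column "id".
    # If the referenced field does not exist on the event, Logstash
    # leaves "%{id}" unresolved as a literal string, so rows collide
    # on the same _id instead of being deduplicated per row.
    document_id => "%{type}-%{id}"
  }
  stdout { codec => rubydebug }
}
```

Is deriving `document_id` from a real key column like this the right direction for making repeated runs idempotent?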