JDBC plugin reloads everything, every time

Hello All.

I am trying to get data from a SQL Server DB with a very simple query. For some reason, every time I restart Logstash the whole data set gets loaded into Elasticsearch again.

The marker file seems OK, and the very last ID is contained in the file, but I don't get why everything is extracted again each time.

Does anyone see anything wrong with my input configuration?

Thanks!

input {
  jdbc {
    jdbc_driver_library => "C:\logstash\drivers\mssql-jdbc-6.2.2.jre7.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://9.9.9.9:52824;instanceName=MyInstance;databasename=MyBD"
    jdbc_user => "logstash"
    jdbc_password => "SomePassword"
    record_last_run => true
    use_column_value => true
    last_run_metadata_path => "C:\logstash\SQL\marker.txt"
    statement => "SELECT evt.event_id, Field1, Field2, Field3 FROM MyTable evt ORDER BY evt.event_id"
    tracking_column_type => "numeric"
    tracking_column => "event_id"
    tags => "KAS"
  }
}

Your query always fetches all rows; it doesn't reference the :sql_last_value parameter to restrict the selection to rows that haven't been processed before. The tracking options only record the last value in marker.txt; it is up to your SQL statement to actually use it.
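
For example (a sketch reusing the table and column names from your config), adding a WHERE clause on the tracking column makes each run pick up only new rows:

    statement => "SELECT evt.event_id, Field1, Field2, Field3 FROM MyTable evt WHERE evt.event_id > :sql_last_value ORDER BY evt.event_id"

On the very first run, :sql_last_value defaults to 0 for a numeric tracking column, so the full table is loaded once; on subsequent runs the plugin substitutes the value stored in marker.txt, so only rows with a higher event_id are fetched.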
