Is Logstash JDBC tracking_column meant to work with BIGINT?

I have a Logstash JDBC input configuration with tracking_column set to a BIGINT column whose values are in the format YYYYMMDDhhmmsss.

These are obviously very large numbers, and the value written to the last_run_metadata_path file always stays at zero.

If I use a different BIGINT tracking_column that contains small numbers, it works as expected.

Is this a bug?

Here is my config:

input {
	jdbc {
		jdbc_driver_library => "C:\sqljdbc_6.0\enu\jre8\sqljdbc42.jar"
		jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
		jdbc_connection_string => "jdbc:sqlserver://sql;databaseName=database;username=username;password=password;"
		jdbc_user => "username"
		jdbc_password => "password"
		schedule => "* * * * * *"
		statement => "select top(1) lastupdated from table where cast(lastupdated as bigint) > :sql_last_value order by lastupdated asc"
		use_column_value => true
		tracking_column => "lastupdated"
		tracking_column_type => "numeric"
		last_run_metadata_path => "C:\logstash-7.14.0-windows-x86_64\logstash-7.14.0\config\sql_last_value.yml"
	}
}
output {
	stdout { codec => json }
}
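For reference, one variant I have not confirmed changes anything (a sketch only, reusing the table and column names from the config above): casting the tracked column to BIGINT in the SELECT list as well, so the plugin receives a numeric value for lastupdated rather than the raw column value.

	# sketch only: cast in the SELECT list too, not just in the WHERE clause
	statement => "select top(1) cast(lastupdated as bigint) as lastupdated from table where cast(lastupdated as bigint) > :sql_last_value order by lastupdated asc"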

The sql_last_value.yml file just contains --- 0.
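In case the file format matters: it is a single YAML document holding one scalar, so seeding it by hand with a large value to test whether it round-trips would look like the line below (the value is hypothetical, just a 15-digit number matching the YYYYMMDDhhmmsss format).

	--- 202108151200001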
