Logstash not updating last run metadata file

In Logstash I want to import data from a huge database table. To avoid overloading the database, I try to fetch one day's worth of data at a time (the table spans 20 years) and copy it to Elasticsearch. I use :sql_last_value as follows, but it does not update.

input {
  jdbc {
    jdbc_driver_library => "/opt/driver/mssql-jdbc.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://host:port;databaseName=ZENmydb"
    jdbc_user => "username"
    jdbc_password => "password"
    jdbc_default_timezone => "UTC"
    schedule => "* * * * *"
    statement => "SELECT field1, field2, field3, field4, DATEADD(day, 1, :sql_last_value) AS lastupdate FROM table WHERE field3 BETWEEN :sql_last_value AND DATEADD(day, 1, :sql_last_value)"
    use_column_value => true
    tracking_column => "lastupdate"
    tracking_column_type => "timestamp"
    last_run_metadata_path => "/opt/config/sql_last_value.yml"
  }
}

The content of sql_last_value.yml is: --- !ruby/object:DateTime '2000-01-01 00:00:00.000000000 Z'
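
For reference, before each scheduled run Logstash replaces :sql_last_value with the value stored in that file, so with the metadata above the executed statement should look roughly like this (a sketch assuming the field names from the config and the 2000-01-01 value shown; the exact timestamp literal format may differ):

-- Approximate statement after :sql_last_value substitution,
-- assuming the stored value is still 2000-01-01 00:00:00 UTC
SELECT field1, field2, field3, field4,
       DATEADD(day, 1, '2000-01-01T00:00:00.000') AS lastupdate
FROM table
WHERE field3 BETWEEN '2000-01-01T00:00:00.000'
                 AND DATEADD(day, 1, '2000-01-01T00:00:00.000')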

Every run I get the same values, between '2000-01-01 00:00:00' and '2000-01-02 00:00:00', and the last run metadata file is never updated.
