JDBC Plugin - sql_last_value equals a non-existent timestamp

Hey there,

I recently began to use the JDBC plugin to copy data (log data, to be specific) from a table in an Oracle database into Elasticsearch. New rows are written to that table every few minutes, and I want to update the Elasticsearch index with every new row. While this seems to be working fine, I have one issue that should be easily solvable, although I don't know how yet.

The value of sql_last_value equals the timestamp at which the query was executed, but I would like it to be the latest LOG_TIME in my table.
For example, this is a query extracted from the log file:
SELECT * from Log WHERE LOG_TIME > TIMESTAMP '2017-08-10 11:38:00.253000 +00:00'

But the newest row in the table is from around March of this year. Every minute, when the query is executed again, the value of sql_last_value just advances by one minute.

This is my configuration:
jdbc {
  jdbc_connection_string => "jdbc:oracle:thin://@DB:PORT:SID"
  jdbc_user => "USER"
  jdbc_password => "PASSWORD"
  jdbc_driver_library => "/home/logadmin/Downloads/ojdbc.jar"
  jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
  tracking_column => "LOG_TIME"
  tracking_column_type => "timestamp"
  statement => "SELECT * from ISI_Log WHERE LOG_TIME > :sql_last_value"
  type => "db"
  schedule => "*/1 * * * *"
}

Is this even possible to achieve?

I think you need to add use_column_value => true
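For reference, a minimal sketch of what the jdbc input could look like with that option added (connection and driver settings omitted; table and column names copied from the configuration above; this is a sketch, not a tested configuration):

jdbc {
  # ... connection and driver settings as above ...
  use_column_value => true
  tracking_column => "LOG_TIME"
  tracking_column_type => "timestamp"
  statement => "SELECT * from ISI_Log WHERE LOG_TIME > :sql_last_value"
  type => "db"
  schedule => "*/1 * * * *"
}

With use_column_value enabled, sql_last_value is taken from the tracking column of the last row returned instead of from the time the query ran.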

When I do that, I get the following messages after every query:
[2017-08-31T15:07:00,308][INFO ][logstash.inputs.jdbc ] (0.000000s) SELECT * from ISI_Log WHERE LOG_TIME >= TIMESTAMP '1970-01-01 01:00:00.000000 +01:00'
[2017-08-31T15:07:00,308][WARN ][logstash.inputs.jdbc ] tracking_column not found in dataset. {:tracking_column=>"LOG_TIME"}

The time never changes and is always from the year 1970.
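One possible cause, offered as an assumption rather than something confirmed in this thread: the jdbc input lowercases column names in the result set by default (lowercase_column_names defaults to true), so the tracking column may need to be referenced in lowercase for the plugin to find it in the dataset. Roughly:

jdbc {
  # ... connection and driver settings as above ...
  use_column_value => true
  tracking_column => "log_time"    # lowercase, matching the lowercased result column
  tracking_column_type => "timestamp"
  statement => "SELECT * from ISI_Log WHERE LOG_TIME > :sql_last_value"
  schedule => "*/1 * * * *"
}

If the tracking column is never found, sql_last_value is never updated, which would explain why the query keeps starting from the 1970 epoch.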
