Hi, I have used the Logstash jdbc input to pull data from an Oracle db for a year. Last week my company changed the db server, and my jdbc pipeline no longer works correctly.
As you can see below, my pipeline is scheduled to start at 8 a.m. and run every 5 minutes. On the first run of the day, my sql_last_value is --- 2024-09-25 07:56:58.390960000 +00:00
and no data comes back because of the wrong timezone.
But if I delete the last_run_metadata file and let it run again, sql_last_value becomes --- 2024-09-25 02:14:48.139529000 Z
which is also not the correct timezone, yet the pipeline runs fine (my timezone is UTC+7 and the timestamp in the db is 2024-09-25 09:14:48).
Summary:
- On the first run of the day, sql_last_value is
--- 2024-09-25 07:56:58.390960000 +00:00
-> no data, because in UTC+7 that is 2024-09-25 14:56:58, i.e. in the future relative to the db timestamps.
- After deleting last_run_metadata and running again, sql_last_value is
--- 2024-09-25 02:14:48.139529000 Z
-> in UTC+7 that is 2024-09-25 09:14:48, which matches the db data.
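For context, orders.sql presumably filters on the tracking column against :sql_last_value. A minimal sketch of what such a query typically looks like (the table and column names other than last_change are hypothetical; only last_change appears in my config):

```sql
-- Hypothetical sketch of orders.sql; :sql_last_value is substituted by the jdbc input.
SELECT order_id, status, last_change
FROM orders
WHERE last_change > :sql_last_value
ORDER BY last_change ASC
```

With a query shaped like this, a sql_last_value serialized with the wrong offset (a local time stamped +00:00) pushes the predicate several hours into the future, so no rows match until the db catches up.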
input {
  jdbc {
    jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/ojdbc6-11.2.0.3.jar"
    jdbc_driver_class => "Java::oracle.jdbc.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@10.xx.xx.xx:1521/xxxx"
    jdbc_user => "xxxx"
    jdbc_password => "xxxx"
    statement_filepath => "/etc/logstash/conf.d/database/sql/orders.sql"
    use_column_value => true
    tracking_column => "last_change"
    tracking_column_type => "timestamp"
    last_run_metadata_path => "/etc/logstash/conf.d/database/last_run/orders"
    schedule => "*/5 8-15 * * 1-5"
    type => "orders"
    tags => "order"
  }
}
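For reference, the jdbc input also has a jdbc_default_timezone option, which tells the plugin which zone to assume for timestamps the driver returns without zone information. A sketch only, not a confirmed fix; the zone name is an assumption based on my UTC+7 location:

```
input {
  jdbc {
    # ... same settings as above ...
    # Assumed zone for my UTC+7 location; interpret zoneless db timestamps in this zone
    jdbc_default_timezone => "Asia/Ho_Chi_Minh"
  }
}
```

I am not sure whether this interacts with how sql_last_value is written to the last_run_metadata file, which is the part that changed after the db server migration.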