Hi,
I am trying to ingest my PostgreSQL tables into Elasticsearch using the Logstash JDBC input. I got it working, but I ran into the problem of the same data being ingested over and over again. I know I need to use a tracking column, but I think I have a problem with the timestamp format.
This is my conf file:
input {
  jdbc {
    clean_run => true
    jdbc_driver_library => "/tmp/postgresql-42.6.0.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://postgres:5432/template1"
    jdbc_user => "postgres"
    jdbc_password => "******"
    statement => "SELECT * FROM public.users WHERE last_update > :sql_last_value AND last_update < CURRENT_TIMESTAMP ORDER BY last_update ASC"
    use_column_value => true
    tracking_column_type => "timestamp"
    tracking_column => "last_update"
    last_run_metadata_path => "/tmp/last.yml"
    #schedule => "* * * * *"
  }
}

filter {
  mutate {
  }
}

output {
  elasticsearch {
    hosts => ["******"]
    index => "sql_idx"
    user => "elastic"
    password => "******"
    ssl => true
    ssl_certificate_verification => false
  }
}
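For reference, this is my understanding of the tracking-related settings, based on my reading of the jdbc input plugin docs; the comments are my own interpretation, so please correct me if I got something wrong:

```
jdbc {
  # as I understand it, this discards the previously saved run state,
  # so sql_last_value falls back to 1970-01-01 on every startup --
  # could this be why the epoch keeps showing up in my query?
  clean_run => true

  # track the value of the last_update column rather than the run time
  use_column_value => true
  tracking_column => "last_update"
  tracking_column_type => "timestamp"

  # file where sql_last_value is persisted between runs
  last_run_metadata_path => "/tmp/last.yml"
}
```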
But the problem is that the Logstash log still shows:
WHERE last_update > '1970-01-01 00:00:00.000000+0000'
The last.yml file keeps the correct record:
2023-10-20 12:10:06.789306000 Z
But it seems like :sql_last_value does not pick it up correctly. Can anyone help me with this? Thank you in advance.
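If I understand the :sql_last_value substitution correctly, after one successful run the statement should expand to something like the query below (using the timestamp saved in my metadata file), instead of the epoch value the log shows:

```sql
SELECT * FROM public.users
WHERE last_update > '2023-10-20 12:10:06.789306'
  AND last_update < CURRENT_TIMESTAMP
ORDER BY last_update ASC
```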