I'm importing data from a PostgreSQL database into Elasticsearch using Logstash.
My input looks like this:
input {
  jdbc {
    jdbc_connection_string => "***"
    jdbc_user => "***"
    jdbc_password => "***"
    jdbc_driver_library => "/path/postgresql-42.1.4.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    use_column_value => true
    tracking_column => "updated_at"
    tracking_column_type => "timestamp"
    clean_run => true
    last_run_metadata_path => "/path/dump_shops.logstash_jdbc_last_run"
    statement => "
      select
        'shop' as type,
        id_catalog_shop id_shop,
        fk_catalog_company,
        'company' shop,
        address,
        lon,
        lat,
        updated_at,
        active,
        'category' as category,
        to_json(opening_hours::text)#>>'{}' as opening_hours
      from catalog_shop
      WHERE updated_at > :sql_last_value
    "
  }
}
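As a side note on the settings above: according to the jdbc input plugin documentation, `clean_run => true` discards any previously persisted run state, which resets :sql_last_value to its default (Jan 1st, 1970) on every run. A minimal sketch of the relevant lines with state preservation enabled (all other settings unchanged from above, `...` standing for the omitted options):

```
input {
  jdbc {
    ...
    clean_run => false   # keep the persisted :sql_last_value between runs
    last_run_metadata_path => "/path/dump_shops.logstash_jdbc_last_run"
    ...
  }
}
```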
When I run the corresponding conf file with this command:

sudo /usr/share/logstash/bin/logstash -f /path/dump_shops.conf --path.data /path/dump_shops/

the script executes, but I can see that :sql_last_value is equal to its default value (Jan 1st, 1970) instead of the last updated_at value:
select
  'shop' as type,
  id_catalog_shop id_shop,
  fk_catalog_company,
  'company' shop,
  address,
  lon,
  lat,
  updated_at,
  active,
  'category' as category,
  to_json(opening_hours::text)#>>'{}' as opening_hours
from catalog_shop
WHERE updated_at > '1970-01-01 00:00:00.000000+0000'
On the other hand, my dump_shops.logstash_jdbc_last_run file is being updated correctly:
vim /path/dump_shops.logstash_jdbc_last_run
--- 2018-04-12 09:53:26.040731000 +00:00
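For what it's worth, that file is just a YAML scalar document holding the last tracked timestamp. A small sketch for inspecting it outside Logstash (the parse_last_run helper is hypothetical; it assumes the single-line `--- <timestamp>` format shown above):

```python
from datetime import datetime

def parse_last_run(line: str) -> datetime:
    """Parse a line like '--- 2018-04-12 09:53:26.040731000 +00:00'
    (YAML document marker followed by a timestamp) into a datetime."""
    raw = line.strip()
    if raw.startswith("--- "):
        raw = raw[4:]
    # Logstash writes nanosecond precision, but %f accepts at most six
    # fractional digits, so truncate to microseconds before parsing.
    date_part, rest = raw.split(".", 1)
    frac, tz = rest.split(" ", 1)
    return datetime.strptime(f"{date_part}.{frac[:6]} {tz}",
                             "%Y-%m-%d %H:%M:%S.%f %z")

ts = parse_last_run("--- 2018-04-12 09:53:26.040731000 +00:00")
print(ts.isoformat())  # → 2018-04-12T09:53:26.040731+00:00
```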
I tried deleting dump_shops.logstash_jdbc_last_run, but when I run the script again, :sql_last_value is still equal to Jan 1st, 1970, even though dump_shops.logstash_jdbc_last_run keeps being updated.