I am using the jdbc input plugin to fetch data from a PostgreSQL DB. It seems to work fine the first time and I am able to pull the data, but it is not following the saved state: every run queries all of the data again and produces a lot of duplicates.
I checked .logstash_jdbc_last_run. The metadata state is updated as required, yet the plugin still imports the entire table on every run. Is anything wrong in the config?
Thanks for the update @guyboertje. I removed clean_run => true, but it is still not running according to the saved state. I have checked again: .logstash_jdbc_last_run is saving the last run value from the specified column.
Now you actually have to use the id in your SQL statement.
Something like...
statement => "select id,timestamp,distributed_query_id,distributed_query_task_id, \"columns\"->>'uid' as uid, \"columns\"->>'name' as name from distributed_query_result where id > :sql_last_value order by id asc"
One caveat about the quoting: in PostgreSQL, the key after the ->> operator must be a single-quoted string literal ('uid'), not a double-quoted identifier, so something like "columns"->>"uid" would be read as a column reference and fail. That means you should keep double quotes around the whole statement in the LS config and escape the quotes around the "columns" identifier, as in the statement above; wrapping the statement in single quotes would leave no way to write the 'uid' literal.
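Putting the pieces together, here is a minimal sketch of a jdbc input that tracks state by id. The paths, connection string, and credentials are placeholders; the key parts are use_column_value / tracking_column and the :sql_last_value reference in the statement, which the plugin substitutes with the value saved in .logstash_jdbc_last_run.

```
input {
  jdbc {
    # Placeholder driver path and connection details -- adjust to your setup
    jdbc_driver_library => "/path/to/postgresql.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "user"
    jdbc_password => "secret"
    schedule => "* * * * *"

    # Track the numeric id column instead of the last run timestamp
    use_column_value => true
    tracking_column => "id"
    tracking_column_type => "numeric"

    # :sql_last_value is replaced with the saved state on each run
    statement => "select id,timestamp,distributed_query_id,distributed_query_task_id, \"columns\"->>'uid' as uid, \"columns\"->>'name' as name from distributed_query_result where id > :sql_last_value order by id asc"
  }
}
```

With this in place, each scheduled run only fetches rows with an id greater than the highest id seen so far, so no duplicates are pulled.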