How should I use sql_last_value in logstash?

@magnusbaeck Yes, I did what you suggested: added the codec and ran with debug mode as well.

Part of the output:

```
[2016-11-02T16:52:00,276][INFO ][logstash.inputs.jdbc ] (0.002000s) SELECT count(*) AS count FROM (SELECT * from TEST where id > '2016-11-02 11:21:00') AS t1 LIMIT 1
[2016-11-02T16:52:00,279][DEBUG][logstash.inputs.jdbc ] Executing JDBC query {:statement=>"SELECT * from TEST where id > :sql_last_value", :parameters=>{:sql_last_value=>2016-11-02 11:21:00 UTC}, :count=>0}
[2016-11-02T16:52:00,287][INFO ][logstash.inputs.jdbc ] (0.003000s) SELECT count(*) AS count FROM (SELECT * from TEST where id > '2016-11-02 11:21:00') AS t1 LIMIT 1
[2016-11-02T16:52:00,582][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
```

What should I be checking for in the output? I'm racking my brain over this. :confused:

Can't I use a table's id column to pick up the newly added records and update the index? I tried it with date and datetime fields and it works perfectly fine, but then I have to find a workaround for the id.
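For what it's worth, I believe the jdbc input can track a numeric column like id instead of the last run time, via `use_column_value` and `tracking_column`. A minimal sketch of what I think the input block would look like (table and column names here are placeholders, and connection settings are omitted):

```
input {
  jdbc {
    # jdbc_connection_string, jdbc_user, jdbc_driver_* etc. omitted
    statement => "SELECT * FROM TEST WHERE id > :sql_last_value ORDER BY id"
    use_column_value => true          # track a column value, not the last run time
    tracking_column => "id"           # numeric primary key of the table
    tracking_column_type => "numeric" # so sql_last_value is a number, not a timestamp
    schedule => "* * * * *"           # poll every minute
  }
}
```

With `use_column_value => true`, `:sql_last_value` should hold the highest id seen on the previous run (persisted in the last-run metadata file) instead of a timestamp, which may explain why the query above is currently comparing id against a datetime string.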