Logstash 2.2 :sql_last_value

Hi, my jdbc input is configured like this:

file: simple-out.conf

input {
  jdbc {
    jdbc_driver_library => "C:/elastic/logstash-2.2.1/bin/ojdbc14.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@//192.168.XX.YY:1521/dbname"
    jdbc_user => "user"
    jdbc_password => "pass"
    # schedule => "* * * * *"
    clean_run => false
    record_last_run => true
    statement => "SELECT * FROM contact WHERE contact_id > :sql_last_value"
    use_column_value => true
    tracking_column => "contact_id"
  }
}
output {
  elasticsearch {
    index => "adress1"
    document_type => "adress"
    hosts => ["localhost:9200"]
  }
}

When I run it without the tracking options and without :sql_last_value, it runs fine but re-imports everything on each run.
I only want to import new contacts, i.e. rows where contact_id (a number, not a date) is greater than the previous maximum.

The thing is that what gets passed to the database is always a timestamp, not a numeric value (it should be 0 on the first run, and the previous maximum afterwards),
and I always get an ORA-00932 error saying it expected a number, not a timestamp.
How do I tell it to start at 0 instead of a date?

Thanks.

Found it myself.

I had done a previous run with use_column_value set to false, so a date had been stored as the last value.
I had to do a clean run (clean_run => true) so the stored timestamp was deleted; the value was then re-initialised to 0 and there were no more import errors.
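For anyone hitting the same thing, here is a sketch of inspecting and resetting the stored value by hand, assuming the jdbc input's default last_run_metadata_path ($HOME/.logstash_jdbc_last_run); adjust the path if you set that option:

```shell
# The jdbc input persists :sql_last_value between runs in a small
# metadata file; this is the default path unless last_run_metadata_path
# was set in the input block:
LAST_RUN="$HOME/.logstash_jdbc_last_run"

# Inspect what is currently stored; a timestamp here is what triggers
# ORA-00932 when the tracking column is numeric
if [ -f "$LAST_RUN" ]; then
  cat "$LAST_RUN"
fi

# Remove the file so the next run re-initialises :sql_last_value to 0
# (one run with clean_run => true has the same effect)
rm -f "$LAST_RUN"
```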

Hope it will help others.
