Missing events using JDBC input

I am streaming events from the database of a device that logs events with millisecond precision, using the Logstash JDBC input. The input is scheduled to rerun every minute and update the index with new entries from the DB.

The issue is that some events from the source are missing from Elasticsearch. Events that occur in the milliseconds before Logstash reruns the query appear to be the ones that get missed.

Is there a better way to stream real-time events with the Logstash JDBC input when using timestamp-based column tracking?

input {
  jdbc {
    jdbc_connection_string => "jdbc:vjdbc:rmi://192.168.65.14:2000/VJdbc,eqe"
    jdbc_user => "username"
    jdbc_password => "password"
    jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/vjdbc.jar,/usr/share/logstash/logstash-core/lib/jars/commons-logging-1.1.jar"
    jdbc_driver_class => "com.device.vjdbc.VirtualDriver"
    jdbc_default_timezone => "Etc/UTC"
    schedule => "* * * * *"
    clean_run => false
    last_run_metadata_path => "/usr/share/logstash/config/.lastrun.txt"
    tracking_column_type => "timestamp"
    statement => "SELECT connection_sec, dst_ipaddr, dst_port, src_ipaddr, src_port FROM event WHERE connection_sec > :sql_last_value"
  }
}
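
For reference, this is roughly how I understand explicit timestamp-based column tracking would be wired up; the use_column_value and tracking_column settings below are a sketch rather than my working config, and the connection settings are omitted:

input {
  jdbc {
    # Connection, driver, and scheduling settings as in the config above.
    clean_run => false
    last_run_metadata_path => "/usr/share/logstash/config/.lastrun.txt"
    # Track the highest value seen in the connection_sec column itself
    # rather than the time the query last ran.
    use_column_value => true
    tracking_column => "connection_sec"
    tracking_column_type => "timestamp"
    statement => "SELECT connection_sec, dst_ipaddr, dst_port, src_ipaddr, src_port FROM event WHERE connection_sec > :sql_last_value ORDER BY connection_sec ASC"
  }
}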

Perhaps you should use connection_sec >= :sql_last_value instead of connection_sec > :sql_last_value? There is then a risk of fetching the same row twice, but that's a better problem to have.
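
As a rough, untested sketch of what I mean: change the statement to use >=, and make the re-fetched rows harmless on the Elasticsearch side. The fingerprint filter plus document_id on the elasticsearch output is one common way to make a duplicate row overwrite itself instead of creating a second document; the index name below is just a placeholder.

input {
  jdbc {
    # ... connection, driver, and scheduling settings unchanged ...
    tracking_column_type => "timestamp"
    statement => "SELECT connection_sec, dst_ipaddr, dst_port, src_ipaddr, src_port FROM event WHERE connection_sec >= :sql_last_value"
  }
}

filter {
  # Build a stable ID from the row fields so that a row fetched twice
  # maps to the same Elasticsearch document.
  fingerprint {
    source => ["connection_sec", "src_ipaddr", "src_port", "dst_ipaddr", "dst_port"]
    concatenate_sources => true
    method => "SHA1"
    target => "[@metadata][row_id]"
  }
}

output {
  elasticsearch {
    # "device-events" is a placeholder index name.
    index => "device-events"
    document_id => "%{[@metadata][row_id]}"
  }
}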
