Multiple Instances of Logstash JDBC input

My requirement is to fetch updated/new records from a table. I use sql_last_value to fetch only the latest records. I want to run Logstash in multiple instances so I can survive OS/hardware crashes and keep getting continuous updates. But sql_last_value is updated in each instance separately, which causes duplicate data and the same query running multiple times with the same timestamp. Has anyone tried reading sql_last_value from a centralized location so that multiple instances avoid redundant queries against the DB?

I am using the following JDBC input:

input {
  jdbc {
    jdbc_connection_string => ""    
    jdbc_user => ""
    jdbc_password => ""
    # Path to the downloaded JDBC driver
    jdbc_driver_library => "/app/logstash/jdbc/ojdbc6.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    # The query; :sql_last_value is the value tracked from the previous run
    statement => "SELECT * FROM TEST1 WHERE modifyts > :sql_last_value"
    schedule => "*/5 * * * *"
    use_column_value => false
    #tracking_column => "MODIFYTS"
    lowercase_column_names => false
    last_run_metadata_path => "/centralized-location/last-run/logstash_jdbc_last_run_test1_log"
    record_last_run => true
    sql_log_level => "debug"
    #tracking_column_type => "timestamp"
    jdbc_default_timezone => "America/Los_Angeles"
    id => "test1"
  }
}
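To make the duplication concrete: because each instance keeps its own copy of sql_last_value (its own last_run_metadata_path), both instances start from the same tracked value and fetch the same rows. A minimal pure-Python sketch of that behavior (not Logstash code; the table rows and timestamps are made up for illustration):

```python
from datetime import datetime, timedelta

# Simulated TEST1 rows: (id, modifyts), one row every 5 minutes
rows = [(i, datetime(2024, 1, 1) + timedelta(minutes=5 * i)) for i in range(6)]

def fetch_since(last_value):
    """Stand-in for: SELECT * FROM TEST1 WHERE modifyts > :sql_last_value"""
    return [r for r in rows if r[1] > last_value]

# Each instance reads its own last-run metadata file, so both hold
# the same sql_last_value and issue the same query on the same schedule.
instance_a_last = datetime(2024, 1, 1)
instance_b_last = datetime(2024, 1, 1)

batch_a = fetch_since(instance_a_last)
batch_b = fetch_since(instance_b_last)

# Every row newer than the tracked value is fetched by BOTH instances
duplicates = set(batch_a) & set(batch_b)
print(len(duplicates))  # → 5
```

A shared (centralized) sql_last_value would avoid this, but Logstash itself writes the metadata file per process, so coordination would have to happen outside the plugin.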

Any suggestions?

