Logstash error finding tracking_column in jdbc input for MongoDB

Hello,
I am using the Logstash jdbc input to pull data from MongoDB and push it to Elasticsearch. I am using this pipeline configuration for Logstash:

input {
  jdbc {
    type => "bb_purchaselog"
    jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/mongojdbc3.1.jar"
    jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
    jdbc_connection_string => "jdbc connection string"
    jdbc_user => "dbuser"
    jdbc_password => "pass"
#    jdbc_paging_enabled => true
#    tracking_column => "transaction_time"
    last_run_metadata_path => "/usr/share/logstash/lastrun/bb-purchaselog-lastrun.yml"
#    use_column_value => true
#    tracking_column_type => "numeric"
    schedule => "1 */1 * * * *"
    statement_filepath => "/usr/share/logstash/query/bb-purchaselog.js"
  }
  .
  .
  .
}

My configuration works correctly, but when I enable the tracking_column and use_column_value options, the Logstash jdbc input returns an error along the lines of "can't find tracking_column value".
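
For reference, these are the lines I uncomment when I try to enable tracking. This is just a sketch of the relevant part; the driver, connection, schedule, and query settings stay the same as above, and I am assuming transaction_time comes back from the query as a numeric (epoch) value:

input {
  jdbc {
    # ... same driver, connection, schedule and statement_filepath settings as above ...
    use_column_value => true
    tracking_column => "transaction_time"
    tracking_column_type => "numeric"
    last_run_metadata_path => "/usr/share/logstash/lastrun/bb-purchaselog-lastrun.yml"
    # As far as I understand, the query has to return a transaction_time field
    # in its result set, otherwise the plugin cannot find the tracking column.
  }
}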

I have three questions:

  1. Can I use the tracking_column and use_column_value features of the jdbc input plugin when I am using mongojdbc3.1.jar?

  2. If I can't use these two features, is there a way to set the :sql_last_value value in the input or filter stage of the Logstash config? (See the sketch after this list for what I have in mind.)

  3. If there is no solution to the two questions above, can you describe the risks of data loss when the use_column_value feature is not used?
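
For question 2, this is roughly what I have in mind: dropping statement_filepath and putting the query inline so it references :sql_last_value directly. The collection name purchaselog is only a placeholder, and I have not confirmed that the DbSchema driver accepts this query form once Logstash substitutes :sql_last_value:

input {
  jdbc {
    # ... same driver and connection settings as above ...
    use_column_value => true
    tracking_column => "transaction_time"
    tracking_column_type => "numeric"
    # Inline statement instead of statement_filepath; Logstash fills in
    # :sql_last_value with the value stored in last_run_metadata_path.
    statement => 'db.purchaselog.find({ "transaction_time": { "$gt": :sql_last_value } })'
  }
}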

Thanks for any help.

Any response? :frowning_face:
