JDBC input plugin only works outside of pipelines.yml - 6.4.2

Please note: I am able to run this configuration successfully in Logstash 6.4.2 using the command `logstash -f simple.conf`, but not when it is run via pipelines.yml.
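Something like this works, assuming the standard package-install paths for the binary and the config file (the same config file referenced in the steps below):

    /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/simple.conf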

  • Version: 6.4.2

  • Operating System: Ubuntu 16.04

  • Config File (if you have sensitive info, please remove it):

    input {
      jdbc {
        id => "throughput"
        jdbc_connection_string => "jdbc:db2://:50000/MYDB"
        jdbc_user => "jdbcuser"
        jdbc_password => ""
        jdbc_driver_library => "/root/db2jcc4.jar"
        jdbc_driver_class => "com.ibm.db2.jcc.DB2Driver"
        jdbc_validate_connection => "true"
        statement => "SELECT * FROM MYSCHEMA.MYTABLE"
      }
    }

    output {
      stdout { codec => json }
    }

  • Sample Data (error output when started via pipelines.yml):

    [2018-10-18T13:18:15,153][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
    Pipeline_id:throughput
    Plugin: <LogStash::Inputs::Jdbc jdbc_user=>"jdbcuser", jdbc_validate_connection=>true, jdbc_password=><password>, statement=>"SELECT * FROM MYSCHEMA.MYTABLE", jdbc_driver_library=>"db2jcc4.jar", id=>"throughput", jdbc_connection_string=>"jdbc:db2://:50000/MYDB", jdbc_driver_class=>"com.ibm.db2.jcc.DB2Driver", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_2f20236b-bdac-48b6-adb5-d01767cfba1a", enable_metric=>true, charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, parameters=>{"sql_last_value"=>1969-12-31 16:00:00 -0800}, last_run_metadata_path=>"/usr/share/logstash/.logstash_jdbc_last_run", use_column_value=>false, tracking_column_type=>"numeric", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>
    Error: com.ibm.db2.jcc.DB2Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
    Exception: LogStash::ConfigurationError
    Stack: /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:163:in `open_jdbc_connection'
    /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:221:in `execute_statement'
    /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:277:in `execute_query'
    /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:263:in `run'
    /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:409:in `inputworker'
    /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:403:in `block in start_input'

  • Steps to Reproduce:

  1. Add the pipeline configuration to pipelines.yml (example below):

    - pipeline.id: throughput
      path.config: "/etc/logstash/conf.d/simple.conf"
  2. Update logstash.yml to set log.level: error (see the sketch after this list)

  3. service logstash start

  4. Check log messages to view the error.
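For steps 2–4, a rough sketch (file locations are the usual package-install defaults; adjust to your install):

    # /etc/logstash/logstash.yml — log only errors
    log.level: error

    # start the service and watch the Logstash log for the error
    # (log file path assumed to be the package-install default)
    service logstash start
    tail -f /var/log/logstash/logstash-plain.log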

EDIT:
I found my own solution: I had to copy the db2jcc4.jar file into the directory of JARs that Logstash adds to the CLASSPATH on startup (see the sketch after the steps below).

  1. Copied db2jcc4.jar to /usr/share/logstash/logstash-core/lib/jars
  2. chmod 644 db2jcc4.jar
  3. chown <logstash user>:<logstash group> db2jcc4.jar
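The same fix as a shell sketch (the jar's source path and the logstash user/group are assumptions; adjust to your install):

    # copy the DB2 JDBC driver into the directory of jars Logstash puts on its CLASSPATH at startup
    cp /root/db2jcc4.jar /usr/share/logstash/logstash-core/lib/jars/
    # make it readable by the Logstash service user (user/group assumed to be logstash:logstash)
    chmod 644 /usr/share/logstash/logstash-core/lib/jars/db2jcc4.jar
    chown logstash:logstash /usr/share/logstash/logstash-core/lib/jars/db2jcc4.jar
    # restart so the new jar is picked up
    service logstash restart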
