"Error: com.mysql.jdbc.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?"

Hello, I'm going in circles and having no luck getting the mysql-connector to work with 7.3.1.
I'm using two separate Docker code bases, but the method should be the same.

I have a working setup with 6.5.1 using
FROM docker.elastic.co/elasticsearch/elasticsearch-oss:6.5.1

When I switch to using 7.3.1 with
FROM docker.elastic.co/elasticsearch/elasticsearch:${ELK_VERSION}
from this code base: https://github.com/deviantony/docker-elk.git

No matter what I do, I can't seem to get logstash to connect to the same database that the 6.5.1 setup can connect to.

I even tried changing the code base that does work so that it just pulls the newer docker images.

6.5.1 - This works!

version: '3.3'

services:

  logstash:
    build:
      context: logstash/
    volumes:
      - ./logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml:ro
      - ./logstash/pipeline:/usr/share/logstash/pipeline:ro
      - ./mysql-connector-java-8.0.17/:/usr/share/mysql-connector-java-8.0.17
    ports:
      - "5000:5000"
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    networks:
      - elk
    depends_on:
      - elasticsearch

logstash.conf

input {
        jdbc {
               jdbc_connection_string => "jdbc:mysql://10.xx.xx.xx:3306/<mydbiuse>"
               jdbc_user => "<username>"
               jdbc_password => "<pwd>"
               jdbc_driver_library => "/usr/share/mysql-connector-java-8.0.17/mysql-connector-java-8.0.17.jar"
               jdbc_driver_class => "com.mysql.jdbc.Driver"
               jdbc_default_timezone => "UTC"
               statement => "select i.organization_id, o.name, nc.*, sv.* from cwp_secureviews sv INNER JOIN ncc_collaborations nc ON (sv.id = nc.secure_view_id) INNER JOIN identities i ON (i.id = sv.identity_id) INNER JOIN organizations o ON (o.id = i.organization_id)"
         }
}

output {
        stdout { codec => json_lines }
        elasticsearch {
          hosts => ["elasticsearch:9200"]
          index => "metadata-sql"
        }
}
logstash_1       | Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
logstash_1       | [2019-10-07T16:16:01,695][INFO ][logstash.inputs.jdbc     ] (0.046080s) select  * from mytable

7.3.1 - doesn't work

version: '3.2'

services:
  logstash:
    build:
      context: logstash/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./logstash/config/logstash.yml
        target: /usr/share/logstash/config/logstash.yml
        read_only: true
      - type: bind
        source: ./logstash/pipeline
        target: /usr/share/logstash/pipeline
        read_only: true
      - type: bind
        source: ./mysql-connector-java-8.0.17
        target: /usr/share/mysql-connector-java-8.0.17

logstash.conf

input {

        jdbc {
               jdbc_connection_string => "jdbc:mysql://10.50.0.11:3306/<mydbiuse>"
               jdbc_user => "<username>"
               jdbc_password => "<pwd>"
               jdbc_driver_library => "/usr/share/mysql-connector-java-8.0.17/mysql-connector-java-8.0.17.jar"
               jdbc_driver_class => "com.mysql.jdbc.Driver"
               jdbc_default_timezone => "UTC"
               statement => "select  * from mytable"
         }


}

output {
        elasticsearch {
                hosts => "elasticsearch:9200"
                user => "elastic"
                password => "<pwd>"
        }
}

Here are the errors:

logstash_1       | [2019-10-07T15:53:51,527][ERROR][logstash.inputs.jdbc     ] Failed to load /usr/share/mysql-connector-java-8.0.17/mysql-connector-java-8.0.17.jar {:exception=>#<TypeError: failed to coerce jdk.internal.loader.ClassLoaders$AppClassLoader to java.net.URLClassLoader>}
logstash_1       | [2019-10-07T15:53:51,594][ERROR][logstash.javapipeline    ] A plugin had an unrecoverable error. Will restart this plugin.
logstash_1       |   Pipeline_id:main
logstash_1       |   Plugin: <LogStash::Inputs::Jdbc jdbc_user=>"ncryptedcloud", jdbc_password=><password>, statement=>"select  * from my table", jdbc_driver_library=>"/usr/share/mysql-connector-java-8.0.17/mysql-connector-java-8.0.17.jar", jdbc_default_timezone=>"UTC", jdbc_connection_string=>"jdbc:mysql://10.50.0.11:3306/ncryptedcloud", id=>"9fb188d8b6556966f23c69770d6c3de8ae362dba70680cdea43dc4914f580af0", jdbc_driver_class=>"com.mysql.jdbc.Driver", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_0961b98b-23fb-42b3-94a9-dea167b76c15", enable_metric=>true, charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, parameters=>{"sql_last_value"=>#<DateTime: 1970-01-01T00:00:00+00:00 ((2440588j,0s,0n),+0s,2299161j)>}, last_run_metadata_path=>"/usr/share/logstash/.logstash_jdbc_last_run", use_column_value=>false, tracking_column_type=>"numeric", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>
logstash_1       |   Error: com.mysql.jdbc.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
logstash_1       |   Exception: LogStash::ConfigurationError
logstash_1       |   Stack: /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:163:in `open_jdbc_connection'
logstash_1       | /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:221:in `execute_statement'
logstash_1       | /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:277:in `execute_query'
logstash_1       | /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:263:in `run'
logstash_1       | /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:309:in `inputworker'
logstash_1       | /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:302:in `block in start_input'

Any thoughts?


[EDIT: @yaauie added code fences to make it easier to read]

It seems like you have several changes in-flight simultaneously, which makes helping you find a path forward a bit tricky:

  • changing the version of Logstash you're using across major versions
  • switching from the official docker images to images provided by a third party
  • changing how you're mounting your volumes to your docker containers

Additionally, your 6.x logs contain a message indicating that the driver class you are specifying is deprecated and that you should use a different class name going forward. It is possible that an upgrade to Logstash, to Java, or to the third-party packaging is causing the JDBC input plugin to no longer fall through to the legacy driver class name:

Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'.
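If that fall-through is indeed gone, one thing worth trying is switching to the new driver class name the log message suggests. A minimal sketch of the jdbc input with that one change (the connection details below are the placeholders from the original post, not values I've verified):

```
input {
  jdbc {
    # Same settings as the failing 7.3.1 config, except for the driver class:
    jdbc_connection_string => "jdbc:mysql://10.50.0.11:3306/<mydbiuse>"
    jdbc_user => "<username>"
    jdbc_password => "<pwd>"
    jdbc_driver_library => "/usr/share/mysql-connector-java-8.0.17/mysql-connector-java-8.0.17.jar"
    # New class name per the Connector/J 8.x deprecation warning
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    jdbc_default_timezone => "UTC"
    statement => "select * from mytable"
  }
}
```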

This is what I have for 7.3.1, and it works:

input {
    jdbc {
        jdbc_validate_connection => true
        jdbc_connection_string => "jdbc:mysql://rpt001:3306/rpt001_db?zeroDateTimeBehavior=convertToNull"
        jdbc_user => "phpuser"
        jdbc_password => "phpuser"
        jdbc_driver_library => "/root/mysql-connector-java-5.1.47/mysql-connector-java-5.1.47.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        statement => "select tid, reported from disk"
        clean_run => true
        schedule => "10 13 * * *"
    }
}

Here is what I ended up doing to make it work.

I copied the MySQL connector jar file into the logstash directory, then added an ADD command to my Dockerfile:
ADD mysql-connector-java-8.0.17.jar logstash-core/lib/jars/
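For reference, a minimal Dockerfile sketch of that workaround. The base-image tag is an assumption based on the version under discussion; adjust it to match your actual build:

```
# Assumed base image for the version discussed in this thread
FROM docker.elastic.co/logstash/logstash:7.3.1

# Place the connector jar directly on Logstash's own classpath, so the
# jdbc plugin doesn't need to load it through a URLClassLoader (which
# fails on newer JVMs with the "failed to coerce ... AppClassLoader" error)
ADD mysql-connector-java-8.0.17.jar logstash-core/lib/jars/
```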

In my pipeline/logstash.conf, I blanked out the following parameters:

jdbc_driver_class => ""
jdbc_driver_library => ""
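Putting it together, the jdbc input then looks something like this (the connection details are the placeholders from earlier in the thread):

```
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://10.50.0.11:3306/<mydbiuse>"
    jdbc_user => "<username>"
    jdbc_password => "<pwd>"
    # Both left empty: the jar is already on the Logstash classpath via the
    # Dockerfile ADD, and Connector/J 8.x registers its driver through SPI
    jdbc_driver_library => ""
    jdbc_driver_class => ""
    statement => "select * from mytable"
  }
}
```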

This is a workaround of sorts; the underlying issue has supposedly been fixed, but from what I've been told the fix has not yet made it into a Logstash release.

