Logstash and MariaDB won't connect (both dockerized)

Hello everyone,

Newbie here, quite excited to learn about Elastic and the community.
Some help would be great.

I'm getting an error when the JDBC plugin attempts to connect to MariaDB with the following setup:

MariaDB 10.1.14

172.17.0.2:3306
It's in a Docker container, if that matters.

JDBC drivers I've tried:

  • mariadb-java-client-2.1.0.jar
  • mariadb-java-client-1.3.2.jar
  • mysql-connector-java-5.1.43-bin.jar

Elastic Stack

- docker-compose.yml:

version: '2'

services:

  elasticsearch:
    build: elasticsearch/
    volumes:
      - ./elasticsearch/config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml
    ports:
      - "9200:9200"
      - "9300:9300"
    environment:
      ES_JAVA_OPTS: "-Xmx256m -Xms256m"
    networks:
      - elk

  logstash:
    build: logstash/
    volumes:
      - ./logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml
      - ./logstash/pipeline/logstash.conf:/usr/share/logstash/pipeline/logstash.conf
      - ./drivers/:/opt/logstash/vendor/jar/jdbc/
    ports:
      - "5001:5001"
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    networks:
      - elk
    depends_on:
      - elasticsearch

  kibana:
    build: kibana/
    volumes:
      - ./kibana/config/:/usr/share/kibana/config
    ports:
      - "5601:5601"
    networks:
      - elk
    depends_on:
      - elasticsearch

networks:

  elk:
    driver: bridge

- logstash/Dockerfile:

FROM docker.elastic.co/logstash/logstash:5.5.2
RUN /opt/logstash/bin/logstash-plugin install logstash-input-jdbc

- logstash.yml:

http.host: "0.0.0.0"
path.config: /usr/share/logstash/pipeline
xpack.monitoring.enabled: false

- logstash.conf:

input {
  jdbc {
    jdbc_driver_library => "/opt/logstash/vendor/jar/jdbc/mariadb-java-client-2.1.0.jar"
    jdbc_driver_class => "org.mariadb.jdbc.Driver"
    jdbc_connection_string => "jdbc:mariadb://172.17.0.2:3306/heaven"
    jdbc_user => "root"
    jdbc_password => "root"
    jdbc_validate_connection => true
    statement => "SELECT * FROM entities"
  }
}

output {
  stdout { codec => json_lines }
  elasticsearch {
    hosts => "elasticsearch:9200"
    index => "entities"
  }
}

What happens:


I'm only posting Logstash's output here, but I'll be happy to share the Elasticsearch and Kibana configuration and logs if you think they're relevant.
I've tried an input {} using exec {} with a JSON file and that seems to work fine.

logstash_1 | ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
logstash_1 | Sending Logstash's logs to /usr/share/logstash/logs which is now configured via log4j2.properties
logstash_1 | [2017-08-17T22:05:18,729][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>, :added=>[http://elasticsearch:9200/]}}
logstash_1 | [2017-08-17T22:05:18,733][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://elasticsearch:9200/, :path=>"/"}
logstash_1 | [2017-08-17T22:05:19,488][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#Java::JavaNet::URI:0x301c8f4f}
logstash_1 | [2017-08-17T22:05:19,488][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
logstash_1 | [2017-08-17T22:05:19,724][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
logstash_1 | [2017-08-17T22:05:19,735][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>[#Java::JavaNet::URI:0x20752163]}
logstash_1 | [2017-08-17T22:05:19,739][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
logstash_1 | [2017-08-17T22:07:30,932][WARN ][logstash.inputs.jdbc ] Failed test_connection.
logstash_1 | [2017-08-17T22:07:30,937][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
logstash_1 | Plugin: <LogStash::Inputs::Jdbc jdbc_driver_library=>"/opt/logstash/vendor/jar/jdbc/mariadb-java-client-2.1.0.jar", jdbc_driver_class=>"org.mariadb.jdbc.Driver", jdbc_connection_string=>"jdbc:mariadb://172.17.0.2:3306/heaven", jdbc_user=>"root", jdbc_password=>, jdbc_validate_connection=>true, statement=>"SELECT * FROM entities", id=>"5d3bb6f3d8d3e12b5ce2372e6633ce9c0dfcf283-1", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_e8c1c6c5-9b30-438b-bd53-e01844af2c18", enable_metric=>true, charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, parameters=>{"sql_last_value"=>1970-01-01 00:00:00 UTC}, last_run_metadata_path=>"/usr/share/logstash/.logstash_jdbc_last_run", use_column_value=>false, tracking_column_type=>"numeric", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>
logstash_1 | Error: undefined method `close_jdbc_connection' for #Sequel::JDBC::Database:0x5508a385
logstash_1 | [2017-08-17T22:09:41,995][WARN ][logstash.inputs.jdbc ] Exception when executing JDBC query {:exception=>#<Sequel::DatabaseConnectionError: Java::JavaSql::SQLNonTransientConnectionException: Could not connect to address=(host=172.17.0.2)(port=3306)(type=master) : Connection timed out (Connection timed out)>}
logstash_1 | [2017-08-17T22:09:41,996][WARN ][logstash.inputs.jdbc ] Attempt reconnection.
logstash_1 | [2017-08-17T22:11:53,067][WARN ][logstash.inputs.jdbc ] Failed test_connection.
logstash_1 | [2017-08-17T22:11:53,071][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
logstash_1 | Plugin: <LogStash::Inputs::Jdbc jdbc_driver_library=>"/opt/logstash/vendor/jar/jdbc/mariadb-java-client-2.1.0.jar", jdbc_driver_class=>"org.mariadb.jdbc.Driver", jdbc_connection_string=>"jdbc:mariadb://172.17.0.2:3306/heaven", jdbc_user=>"root", jdbc_password=>, jdbc_validate_connection=>true, statement=>"SELECT * FROM entities", id=>"5d3bb6f3d8d3e12b5ce2372e6633ce9c0dfcf283-1", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_e8c1c6c5-9b30-438b-bd53-e01844af2c18", enable_metric=>true, charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, parameters=>{"sql_last_value"=>1970-01-01 00:00:00 UTC}, last_run_metadata_path=>"/usr/share/logstash/.logstash_jdbc_last_run", use_column_value=>false, tracking_column_type=>"numeric", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>
logstash_1 | Error: undefined method `close_jdbc_connection' for #Sequel::JDBC::Database:0x1d59d8a2

I'm lost trying to find a solution to this. Could someone help me?

Thanks a lot =)

You either have to publish the MariaDB port(s) from the MariaDB container or add both Logstash and MariaDB to the same Docker network. The latter is a good idea because it lets you address containers by name, i.e. your Logstash configuration file can say

jdbc_connection_string => "jdbc:mariadb://name-of-mariadb-container:3306/heaven"

instead of hardcoding a dynamic IP address.
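If MariaDB is managed from the same docker-compose.yml, attaching it to the existing elk network could look roughly like this (a sketch only; the service name, image tag, and credentials are assumptions, not taken from your post):

```yaml
  # Hypothetical MariaDB service on the same "elk" bridge network
  # as elasticsearch, logstash, and kibana.
  mariadb:
    image: mariadb:10.1
    environment:
      MYSQL_ROOT_PASSWORD: root   # matches the jdbc_user/jdbc_password in logstash.conf
      MYSQL_DATABASE: heaven
    networks:
      - elk
```

With that in place, Logstash can reach the database as jdbc:mariadb://mariadb:3306/heaven via Docker's built-in DNS, regardless of which IP the container gets.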

This is a Docker question more than it is an Elastic question.

Thanks a bunch, @magnusbaeck

It works just fine now!

This is a Docker question more than it is an Elastic question.

It definitely was! Thanks anyway for clearing it up.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.