Logstash 7.6.1 Exception: Sequel::DatabaseConnectionError

I am using the official docker-elk image for my setup.

My Logstash conf file is:

input {
  jdbc {
    jdbc_driver_library => "/opt/logstash/logstash-core/lib/jars/mysql-connector-java-5.1.48-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://domain:3306/world?user=root&password=*****"
    jdbc_user => "root"
    jdbc_password => "*****"
    statement => "SELECT * FROM world.users limit 100;"
  }
}
output {
  elasticsearch {
    index => "customers"
    document_type => "customer"
    document_id => "%{id}"
    hosts => ["http://localhost:9200"]
  }
}
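
For reference, the same connection parameters can be exercised outside Logstash with a small standalone check like the sketch below (the host and password are placeholders, and it assumes the same Connector/J 5.1 jar that the jdbc input loads):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class JdbcCheck {
    public static void main(String[] args) throws Exception {
        // Same driver class that the Logstash jdbc input is configured with.
        Class.forName("com.mysql.jdbc.Driver");
        // Placeholder host and password; substitute the real values.
        String url = "jdbc:mysql://domain:3306/world";
        try (Connection conn = DriverManager.getConnection(url, "root", "*****");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT 1")) {
            rs.next();
            System.out.println("Connected, SELECT 1 returned " + rs.getInt(1));
        }
    }
}

(Compiled and run inside the Logstash container with the driver jar on the classpath, e.g. java -cp .:/opt/logstash/logstash-core/lib/jars/mysql-connector-java-5.1.48-bin.jar JdbcCheck.)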

Logstash starts, but it then throws a database connectivity error:

root@816b08f53487:/# /opt/logstash/bin/logstash -f /etc/logstash/conf.d/30-output.conf --path.data /tmp/logstash/data

Sending Logstash logs to /opt/logstash/logs which is now configured via log4j2.properties
[2020-04-10T13:56:22,881][WARN ][logstash.config.source.multilocal] Ignoring the pipelines.yml file because modules or command line options are specified
[2020-04-10T13:56:23,152][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.6.1"}
[2020-04-10T13:56:26,428][INFO ][org.reflections.Reflections] Reflections took 112 ms to scan 1 urls, producing 20 keys and 40 values
[2020-04-10T13:56:27,371][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::Elasticsearch index=>"customers", id=>"c897917f2b731ac1f53cf3cd7e701b1aef136e6ac039b3a66232cff3525faa15", document_id=>"%{id}", hosts=>[http://localhost:9200], document_type=>"customer", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_f9c2196f-a946-40f8-a5d0-00cb1bf8f144", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_rollover_alias=>"logstash", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2020-04-10T13:56:28,413][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2020-04-10T13:56:28,735][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2020-04-10T13:56:28,910][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-04-10T13:56:28,917][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the type event field wont be used to determine the document _type {:es_version=>7}
[2020-04-10T13:56:29,013][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>["http://localhost:9200"]}
[2020-04-10T13:56:29,162][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-04-10T13:56:29,272][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2020-04-10T13:56:29,280][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/etc/logstash/conf.d/30-output.conf"], :thread=>"#<Thread:0xebf72b run>"}
[2020-04-10T13:56:29,425][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-04-10T13:56:31,203][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-04-10T13:56:31,406][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-04-10T13:56:32,258][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9601}
[2020-04-10T13:56:33,558][ERROR][logstash.inputs.jdbc ][main] Unable to connect to database. Tried 1 times {:error_message=>"Java::ComMysqlJdbcExceptionsJdbc4::CommunicationsException: Communications link failure\n\nThe last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server."}
[2020-04-10T13:56:33,604][ERROR][logstash.javapipeline ][main] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <LogStash::Inputs::Jdbc jdbc_user=>"root", jdbc_password=><password>, statement=>"SELECT * FROM world.users limit 5;", jdbc_driver_library=>"/opt/logstash/logstash-core/lib/jars/mysql-connector-java-5.1.48-bin.jar", jdbc_connection_string=>"jdbc:mysql://dev.jobprog.net.db:3306/world?user=root&password=******&useUnicode=true&characterEncoding=UTF-8", id=>"5fa5e37bca344860ffde656b59a0bbb7d03e7997798ad056b2c428856b014c35", jdbc_driver_class=>"com.mysql.jdbc.Driver", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_82fa6fb9-3470-4179-8a06-08292777e90c", enable_metric=>true, charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, plugin_timezone=>"utc", last_run_metadata_path=>"/root/.logstash_jdbc_last_run", use_column_value=>false, tracking_column_type=>"numeric", clean_run=>false, record_last_run=>true, lowercase_column_names=>true, use_prepared_statements=>false>
Error: Java::ComMysqlJdbcExceptionsJdbc4::CommunicationsException: Communications link failure
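
From what I have read, a "Communications link failure" is raised before authentication even happens, so it usually means the MySQL host/port is not reachable from inside the Logstash container (wrong hostname, DNS, a firewall, or MySQL bound to 127.0.0.1 only) rather than a credentials problem. A minimal sketch of a plain TCP probe I could run inside the container to check this, with a placeholder host name:

import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder host; substitute the host used in jdbc_connection_string.
        String host = "domain";
        int port = 3306;
        try (Socket socket = new Socket()) {
            // Fails fast if the host cannot be resolved or the port is closed/filtered.
            socket.connect(new InetSocketAddress(host, port), 5000);
            System.out.println("TCP connection to " + host + ":" + port + " succeeded");
        }
    }
}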

I have tried suggestions from a lot of other articles, which recommend changing the connection URL by appending the username and password, but nothing has worked for me.
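
For completeness, as far as I understand Connector/J 5.1, credentials embedded in the URL and credentials passed separately are treated the same, so the URL tweaks alone seemed unlikely to change anything (hypothetical values below):

import java.sql.Connection;
import java.sql.DriverManager;

public class UrlForms {
    public static void main(String[] args) throws Exception {
        // Credentials passed as separate arguments (what jdbc_user / jdbc_password do).
        Connection a = DriverManager.getConnection(
                "jdbc:mysql://domain:3306/world", "root", "*****");
        // Credentials embedded as URL query parameters.
        Connection b = DriverManager.getConnection(
                "jdbc:mysql://domain:3306/world?user=root&password=*****");
        a.close();
        b.close();
    }
}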

Please let me know what is missing. Thanks.
