Logstash has started crashing when trying to connect to Kafka

Version: Logstash 5.6.5
Plugin version: logstash-output-kafka 5.1.11
Operating system: RHEL 7

{code}
[2019-07-31T17:43:04,243][INFO ][org.apache.kafka.clients.producer.KafkaProducer] Closing the Kafka producer with timeoutMillis = 0 ms.
[2019-07-31T17:43:04,248][ERROR][logstash.outputs.kafka ] Unable to create Kafka producer from given configuration {:kafka_error_message=>org.apache.kafka.common.KafkaException: Failed to construct kafka producer}
[2019-07-31T17:43:04,260][ERROR][logstash.pipeline ] Error registering plugin {:plugin=>"#<LogStash::OutputDelegator:0xe9afc99 @namespaced_metric=#<LogStash::Instrument::NamespacedMetric:0x684e19e9 @metric=#<LogStash::Instrument::Metric:0x13f5368c @collector=#<LogStash::Instrument::Collector:0x52537655 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0xb7436b2 @store=#<Concurrent::Map:0x0000000006b8c8 entries=2 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x13ede764>, @fast_lookup=#<Concurrent::Map:0x0000000006b8cc entries=54 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs, :"0ee478d8f9b8f2874e16e0bec48b99ffc8711c60-3"]>, @metric=#<LogStash::Instrument::NamespacedMetric:0x603ed86a @metric=#<LogStash::Instrument::Metric:0x13f5368c @collector=#<LogStash::Instrument::Collector:0x52537655 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0xb7436b2 @store=#<Concurrent::Map:0x0000000006b8c8 entries=2 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x13ede764>, @fast_lookup=#<Concurrent::Map:0x0000000006b8cc entries=54 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs]>, @logger=#<LogStash::Logging::Logger:0x36ffbb90 @logger=#<Java::OrgApacheLoggingLog4jCore::Logger:0x3ad80200>>, @out_counter=LogStash::Instrument::MetricType::Counter - namespaces: [:stats, :pipelines, :main, :plugins, :outputs, :"0ee478d8f9b8f2874e16e0bec48b99ffc8711c60-3", :events] key: out value: 0, @in_counter=LogStash::Instrument::MetricType::Counter - namespaces: [:stats, :pipelines, :main, :plugins, :outputs, :"0ee478d8f9b8f2874e16e0bec48b99ffc8711c60-3", :events] key: in value: 0, @strategy=#<LogStash::OutputDelegatorStrategies::Shared:0x59a3154b @output=<LogStash::Outputs::Kafka topic_id=>"test-dev", bootstrap_servers=>"logakafka1-dev.test.net:9043,logakafka2-dev.test.net:9043,logakafka3-dev.test.net:9043", codec=><LogStash::Codecs::JSON id=>"0ee478d8f9b8f2874e16e0bec48b99ffc8711c60-2", enable_metric=>true, charset=>"UTF-8">, security_protocol=>"SSL", ssl_keystore_location=>"/ssl/keystore/loga-dev.producer.jks", ssl_truststore_location=>"/ssl/truststore/loga.client.truststore.jks", ssl_keystore_password=><password>, ssl_truststore_password=><password>, id=>"0ee478d8f9b8f2874e16e0bec48b99ffc8711c60-3", enable_metric=>true, workers=>1, acks=>"1", batch_size=>16384, block_on_buffer_full=>true, buffer_memory=>33554432, compression_type=>"none", key_serializer=>"org.apache.kafka.common.serialization.StringSerializer", linger_ms=>0, max_request_size=>1048576, metadata_fetch_timeout_ms=>60000, metadata_max_age_ms=>300000, receive_buffer_bytes=>32768, reconnect_backoff_ms=>10, retry_backoff_ms=>100, send_buffer_bytes=>131072, ssl=>false, sasl_mechanism=>"GSSAPI", timeout_ms=>30000, value_serializer=>"org.apache.kafka.common.serialization.StringSerializer">>, @id="0ee478d8f9b8f2874e16e0bec48b99ffc8711c60-3", @time_metric=LogStash::Instrument::MetricType::Counter - namespaces: [:stats, :pipelines, :main, :plugins, :outputs, :"0ee478d8f9b8f2874e16e0bec48b99ffc8711c60-3", :events] key: duration_in_millis value: 0, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x12cb06d4 @metric=#<LogStash::Instrument::Metric:0x13f5368c @collector=#<LogStash::Instrument::Collector:0x52537655 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0xb7436b2 @store=#<Concurrent::Map:0x0000000006b8c8 entries=2 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x13ede764>, @fast_lookup=#<Concurrent::Map:0x0000000006b8cc entries=54 default_proc=nil>>>>, 
@namespace_name=[:stats, :pipelines, :main, :plugins, :outputs, :"0ee478d8f9b8f2874e16e0bec48b99ffc8711c60-3", :events]>, @output_class=LogStash::Outputs::Kafka>", :error=>"Failed to construct kafka producer"}
[2019-07-31T17:43:04,291][ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>org.apache.kafka.common.KafkaException: Failed to construct kafka producer, :backtrace=>["org.apache.kafka.clients.producer.KafkaProducer.<init>(org/apache/kafka/clients/producer/KafkaProducer.java:335)", "org.apache.kafka.clients.producer.KafkaProducer.<init>(org/apache/kafka/clients/producer/KafkaProducer.java:188)", "RUBY.create_producer(/tools/logstash-5.6.5/vendor/bundle/jruby/1.9/gems/logstash-output-kafka-5.1.11/lib/logstash/outputs/kafka.rb:334)", "RUBY.register(/tools/logstash-5.6.5/vendor/bundle/jruby/1.9/gems/logstash-output-kafka-5.1.11/lib/logstash/outputs/kafka.rb:195)", "RUBY.register(/tools/logstash-5.6.5/logstash-core/lib/logstash/output_delegator_strategies/shared.rb:9)", "RUBY.register(/tools/logstash-5.6.5/logstash-core/lib/logstash/output_delegator.rb:43)", "RUBY.register_plugin(/tools/logstash-5.6.5/logstash-core/lib/logstash/pipeline.rb:290)", "RUBY.register_plugins(/tools/logstash-5.6.5/logstash-core/lib/logstash/pipeline.rb:301)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1613)", "RUBY.register_plugins(/tools/logstash-5.6.5/logstash-core/lib/logstash/pipeline.rb:301)", "RUBY.start_workers(/tools/logstash-5.6.5/logstash-core/lib/logstash/pipeline.rb:310)", "RUBY.run(/tools/logstash-5.6.5/logstash-core/lib/logstash/pipeline.rb:235)", "RUBY.start_pipeline(/tools/logstash-5.6.5/logstash-core/lib/logstash/agent.rb:408)"]}
[2019-07-31T17:43:04,364][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-07-31T17:43:07,333][WARN ][logstash.agent ] stopping pipeline {:id=>"main"}
{code}
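
For context, the producer settings dumped in the error above correspond to a kafka output roughly like the sketch below. The topic, broker list, codec, and keystore/truststore paths are taken directly from the log; the two passwords are placeholders, since Logstash redacts them in the dump:

{code}
output {
  kafka {
    topic_id          => "test-dev"
    bootstrap_servers => "logakafka1-dev.test.net:9043,logakafka2-dev.test.net:9043,logakafka3-dev.test.net:9043"
    codec             => json
    security_protocol => "SSL"
    ssl_keystore_location   => "/ssl/keystore/loga-dev.producer.jks"
    ssl_keystore_password   => "CHANGEME"  # placeholder; real value redacted in the log
    ssl_truststore_location => "/ssl/truststore/loga.client.truststore.jks"
    ssl_truststore_password => "CHANGEME"  # placeholder; real value redacted in the log
  }
}
{code}

The "Failed to construct kafka producer" exception is thrown by the Kafka client during producer construction, before any connection to the brokers is attempted. With an SSL configuration like this one, common causes are a keystore or truststore file that the Logstash user cannot read, a file that is not a valid JKS store, or a password that does not match, so those files and credentials are worth checking first.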
