Hi,
We are running logstash-6.2.4 with Kafka 1.1 (Confluent) successfully, using multiple topics and multiple Logstash pipelines. With anonymous access (PLAINTEXT on the default port 9092) there are no issues when taking the complete cluster down and bringing it back up (i.e. no brokers available for a while): the consumers detect that the brokers are back and resume.
However, with SASL_SSL enabled, all Logstash pipelines with a kafka input terminate when the cluster comes back up, with this error:
[2018-05-31T13:29:44,169][ERROR][org.apache.kafka.clients.NetworkClient] [Consumer clientId=mobi_logs-0, groupId=internal_in] Connection to node 4 failed authentication due to: Authentication failed due to invalid credentials with SASL mechanism SCRAM-SHA-256
Exception in thread "Ruby-0-Thread-22: /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-kafka-8.0.6/lib/logstash/inputs/kafka.rb:241" org.apache.kafka.common.errors.SaslAuthenticationException: Authentication failed due to invalid credentials with SASL mechanism SCRAM-SHA-256
I see no reason why this should not work, but I am not sure whether the problem is in logstash-input-kafka, in kafka-clients, or in the broker.
Any input is appreciated.
Configs are listed below.
output {
  kafka {
    client_id => "a_client"
    topic_id => "xyz"
    codec => json
    bootstrap_servers => "001:9094,002:9094,003:9094,004:9094"
    jaas_path => "/path/logstash.jaas.conf"
    security_protocol => "SASL_SSL"
    sasl_mechanism => "SCRAM-SHA-256"
    ssl_truststore_location => "/etc/pki/java/cacerts"
    ssl_truststore_password => "xxxx"
  }
}
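The kafka input pipelines use the equivalent security settings. A sketch of the input side (the topic name is a placeholder; client_id and group_id are taken from the error log above, the rest mirrors the output config):

```
input {
  kafka {
    client_id => "mobi_logs"
    group_id => "internal_in"
    topics => ["some_topic"]
    bootstrap_servers => "001:9094,002:9094,003:9094,004:9094"
    jaas_path => "/path/logstash.jaas.conf"
    security_protocol => "SASL_SSL"
    sasl_mechanism => "SCRAM-SHA-256"
    ssl_truststore_location => "/etc/pki/java/cacerts"
    ssl_truststore_password => "xxxx"
  }
}
```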
logstash.jaas.conf:
KafkaClient {
  org.apache.kafka.common.security.scram.ScramLoginModule required
  username="logstash"
  password="logstash-secret";
};
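For reference, the SCRAM credentials for the logstash user can be checked on the broker side; in Kafka 1.1 they are stored in ZooKeeper, so kafka-configs can describe them (the ZooKeeper address below is a placeholder for the actual ensemble):

```shell
# List the SCRAM credentials registered for user "logstash"
# (zk1:2181 is a placeholder for the real ZooKeeper connect string)
kafka-configs --zookeeper zk1:2181 --describe \
  --entity-type users --entity-name logstash
```

The output should include SCRAM-SHA-256 for the mechanism to be usable by the clients.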