Filebeat not able to send data to a TLS-enabled Kafka cluster

Hi Team,

We are trying to send logs from a Windows host using Filebeat to a Kafka cluster that uses TLS for encryption and Kerberos for authentication.

Filebeat version: 7.9.2

In the Filebeat logs I get this message:

2020-10-12T02:22:54.536Z	INFO	[publisher_pipeline_output]	pipeline/output.go:143	Connecting to kafka(<kafka-broker-hostname>:9093)
2020-10-12T02:22:54.536Z	INFO	[publisher]	pipeline/retry.go:219	retryer: send unwait signal to consumer
2020-10-12T02:22:54.536Z	INFO	[publisher]	pipeline/retry.go:223	  done
2020-10-12T02:22:54.536Z	DEBUG	[kafka]	kafka/client.go:96	connect: [<kafka-broker-hostname>:9093]
2020-10-12T02:22:54.536Z	INFO	[publisher_pipeline_output]	pipeline/output.go:151	Connection to kafka(<kafka-broker-hostname>:9093) established
2020-10-12T02:22:55.330Z	DEBUG	[kafka]	kafka/client.go:277	finished kafka batch
2020-10-12T02:22:55.330Z	DEBUG	[kafka]	kafka/client.go:291	Kafka publish failed with: kafka: client has run out of available brokers to talk to (Is your cluster reachable?)
2020-10-12T02:22:55.330Z	INFO	[publisher]	pipeline/retry.go:219	retryer: send unwait signal to consumer
2020-10-12T02:22:55.330Z	INFO	[publisher]	pipeline/retry.go:223	  done
2020-10-12T02:22:55.560Z	DEBUG	[harvester]	log/log.go:107	End of file reached: C:\Temp\Test\note1 - Copy.csv; Backoff now.
2020-10-12T02:22:56.121Z	DEBUG	[kafka]	kafka/client.go:277	finished kafka batch
2020-10-12T02:22:56.121Z	DEBUG	[kafka]	kafka/client.go:291	Kafka publish failed with: kafka: client has run out of available brokers to talk to (Is your cluster reachable?)
2020-10-12T02:22:56.121Z	INFO	[publisher]	pipeline/retry.go:219	retryer: send unwait signal to consumer
2020-10-12T02:22:56.121Z	INFO	[publisher]	pipeline/retry.go:223	  done

I enabled debug logging on the Kafka cluster and I can see the following error consistently:

2020-10-12 02:22:56,793 DEBUG org.apache.kafka.common.network.Selector: [SocketServer brokerId=11] Connection with 192.168.5.25/192.168.5.25 disconnected
java.io.EOFException: EOF during handshake, handshake status is NEED_UNWRAP
        at org.apache.kafka.common.network.SslTransportLayer.handshakeUnwrap(SslTransportLayer.java:494)
        at org.apache.kafka.common.network.SslTransportLayer.doHandshake(SslTransportLayer.java:339)
        at org.apache.kafka.common.network.SslTransportLayer.handshake(SslTransportLayer.java:265)
        at org.apache.kafka.common.network.KafkaChannel.prepare(KafkaChannel.java:129)
        at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:532)
        at org.apache.kafka.common.network.Selector.poll(Selector.java:467)
        at kafka.network.Processor.poll(SocketServer.scala:689)
        at kafka.network.Processor.run(SocketServer.scala:594)
        at java.lang.Thread.run(Thread.java:748)

Filebeat doesn't show any relevant error message, and the Kafka-side message is very cryptic: java.io.EOFException: EOF during handshake, handshake status is NEED_UNWRAP

I think the connection is not even reaching the authentication stage; it is failing during the TLS handshake.

Does Filebeat support sending output to a SASL_SSL-enabled Kafka cluster?
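
For reference, my kafka output section looks roughly like this. It is only a sketch: the topic, paths, username, and realm are placeholders, and the kerberos options are the beta settings as I understand them from the 7.9 reference, so please correct me if I have misread them.

output.kafka:
  hosts: ["<kafka-broker-hostname>:9093"]
  topic: "test-topic"
  # TLS: trust the CA that signed the broker certificates (path is a placeholder)
  ssl.enabled: true
  ssl.certificate_authorities: ["C:/Temp/certs/kafka-ca.pem"]
  # Kerberos (SASL/GSSAPI), beta in 7.9; all values below are placeholders
  kerberos.enabled: true
  kerberos.auth_type: keytab
  kerberos.username: "filebeat"
  kerberos.keytab: "C:/Temp/certs/filebeat.keytab"
  kerberos.config_path: "C:/Temp/certs/krb5.conf"
  kerberos.service_name: "kafka"
  kerberos.realm: "EXAMPLE.COM"

My assumption is that if ssl.certificate_authorities is missing or does not match the broker certificates, the broker would drop the connection mid-handshake, which seems consistent with the NEED_UNWRAP error above.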

Any help is much appreciated.

Could you please share your Filebeat configuration formatted using </>?
