Hi
The problem occurs when collecting data from Kafka and forwarding it to Logstash: Filebeat fails to connect to Kafka.
Kafka version: 2.5.1
Filebeat version: 8.6.2

Kafka server config:
security.inter.broker.protocol=SASL_PLAINTEXT
sasl.mechanism.inter.broker.protocol=SCRAM-SHA-256
sasl.enabled.mechanisms=SCRAM-SHA-256
allow.everyone.if.no.acl.found=false
super.users=User:admin
authorizer.class.name=kafka.security.auth.SimpleAclAuthorizer
listeners=SASL_PLAINTEXT://0.0.0.0:8080
advertised.listeners=SASL_PLAINTEXT://0.0.0.0:8080
socket.send.buffer.bytes=102400
socket.receive.buffer.bytes=102400
socket.request.max.bytes=104857600
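
For context, the SCRAM credential and the consumer ACL for the client user were set up on the broker side; on 2.5.x the commands look roughly like this (the ZooKeeper address is a placeholder, so the exact invocation may differ):

  # create the SCRAM-SHA-256 credential for the client user (password redacted)
  bin/kafka-configs.sh --zookeeper zk1:2181 --alter \
    --add-config 'SCRAM-SHA-256=[password=xxxx]' \
    --entity-type users --entity-name testuser

  # allow that user to consume the topic with the consumer group used by filebeat
  bin/kafka-acls.sh --authorizer-properties zookeeper.connect=zk1:2181 --add \
    --allow-principal User:testuser \
    --consumer --topic error-test --group test-api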
Filebeat config:
filebeat.inputs:
- type: kafka
  tags: ["kafka-client"]
  enabled: true
  hosts:
    - kafka1:8080
  topics: ["error-test"]
  group_id: test-api
  initial_offset: "oldest"
  sasl.mechanism: SCRAM-SHA-256
  ssl.enabled: true
  username: testuser
  password: xxxx
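
For completeness, the events read from Kafka are forwarded to Logstash; the output side of the config looks roughly like this (the Logstash host is a placeholder):

output.logstash:
  hosts: ["logstash1:5044"]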
On startup, Filebeat logs the error: client has run out of available brokers to talk to (Is your cluster reachable?)
The Kafka broker also prints an exception:
[2023-04-21 15:15:17,413] WARN [SocketServer brokerId=0] Unexpected error from /192.168.0.9; closing connection (org.apache.kafka.common.network.Selector)
org.apache.kafka.common.network.InvalidReceiveException: Invalid receive (size = 369295616 larger than 524288)
at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:105)
at org.apache.kafka.common.security.authenticator.SaslServerAuthenticator.authenticate(SaslServerAuthenticator.java:246)
at org.apache.kafka.common.network.KafkaChannel.prepare(KafkaChannel.java:176)
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:547)
at org.apache.kafka.common.network.Selector.poll(Selector.java:485)
at kafka.network.Processor.poll(SocketServer.scala:861)
at kafka.network.Processor.run(SocketServer.scala:760)
at java.lang.Thread.run(Thread.java:750)
A similar Filebeat config connecting to a Kafka cluster without authentication works fine.
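
One thing I noticed: converting the rejected "size" from the broker warning to hex seems telling:

  # the size reported in the InvalidReceiveException, printed as hex
  printf '%x\n' 369295616
  # prints 16030100

0x16 0x03 0x01 looks like the header of a TLS handshake record (content type 22, record version 3.1), so the broker may be receiving a TLS ClientHello on a plaintext listener. Does that mean ssl.enabled: true should not be set when the listener is SASL_PLAINTEXT, or is something else wrong with the SASL/SCRAM setup?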