Kafka output plugin not working

Hi, I am trying to push logs onto a Kafka topic. Has anybody successfully used the Kafka output plugin in Logstash?

I tried with Logstash versions 6.3.2 and 6.4.0, but got the same result with both.

Below is my config file:
input {
  file {
    path => "/tmp/sat.txt"
  }
}

filter {}

output {
  stdout { codec => rubydebug }
  kafka {
    codec => "plain"
    topic_id => "prod-esb2"
    bootstrap_servers => "kafka.mydomain.com:9092"
  }
}
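As an aside, if /tmp/sat.txt already existed before Logstash started, the file input will tail it from the end and may never emit events, which makes it look like the output is broken. A variant of the input above that forces a full re-read for testing (start_position and sincedb_path are standard file-input options; the /dev/null sincedb is a throwaway choice so the read position isn't persisted between runs):

```
input {
  file {
    path => "/tmp/sat.txt"
    start_position => "beginning"   # read the file from the top, not only new lines
    sincedb_path => "/dev/null"     # don't remember the read position (testing only)
  }
}
```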

This is the error I see when running Logstash in debug mode:
[2018-09-04T16:29:32,961][DEBUG][org.apache.kafka.clients.NetworkClient] [Producer clientId=producer-1] Initialize connection to node kafka.mydomain.com:9092 (id: -1 rack: null) for sending metadata request
[2018-09-04T16:29:32,961][DEBUG][org.apache.kafka.clients.NetworkClient] [Producer clientId=producer-1] Initiating connection to node kafka.mydomain.com:9092 (id: -1 rack: null)
[2018-09-04T16:29:32,999][DEBUG][org.apache.kafka.common.metrics.Metrics] Added sensor with name node--1.bytes-sent
[2018-09-04T16:29:33,004][DEBUG][org.apache.kafka.common.metrics.Metrics] Added sensor with name node--1.bytes-received
[2018-09-04T16:29:33,006][DEBUG][org.apache.kafka.common.metrics.Metrics] Added sensor with name node--1.latency
[2018-09-04T16:29:33,007][DEBUG][org.apache.kafka.common.network.Selector] [Producer clientId=producer-1] Created socket with SO_RCVBUF = 32768, SO_SNDBUF = 131072, SO_TIMEOUT = 0 to node -1
[2018-09-04T16:29:33,176][DEBUG][org.apache.kafka.clients.NetworkClient] [Producer clientId=producer-1] Completed connection to node -1. Fetching API versions.
[2018-09-04T16:29:33,177][DEBUG][org.apache.kafka.clients.NetworkClient] [Producer clientId=producer-1] Initiating API versions fetch from node -1.
[2018-09-04T16:29:33,190][DEBUG][org.apache.kafka.common.network.Selector] [Producer clientId=producer-1] Connection with kafka.mydomain.com/10.250.191.80 disconnected
java.io.EOFException: null
at org.apache.kafka.common.network.NetworkReceive.readFromReadableChannel(NetworkReceive.java:124) ~[kafka-clients-1.1.0.jar:?]
at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:93) ~[kafka-clients-1.1.0.jar:?]
at org.apache.kafka.common.network.KafkaChannel.receive(KafkaChannel.java:235) ~[kafka-clients-1.1.0.jar:?]
at org.apache.kafka.common.network.KafkaChannel.read(KafkaChannel.java:196) ~[kafka-clients-1.1.0.jar:?]
at org.apache.kafka.common.network.Selector.attemptRead(Selector.java:557) ~[kafka-clients-1.1.0.jar:?]
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:495) [kafka-clients-1.1.0.jar:?]
at org.apache.kafka.common.network.Selector.poll(Selector.java:424) [kafka-clients-1.1.0.jar:?]
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:460) [kafka-clients-1.1.0.jar:?]
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:239) [kafka-clients-1.1.0.jar:?]
at org.apache.kafka.clients.producer.internals.Sender.run(Sender.java:163) [kafka-clients-1.1.0.jar:?]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_45]
[2018-09-04T16:29:33,198][DEBUG][org.apache.kafka.clients.NetworkClient] [Producer clientId=producer-1] Node -1 disconnected.
[2018-09-04T16:29:33,200][DEBUG][org.apache.kafka.clients.NetworkClient] [Producer clientId=producer-1] Give up sending metadata request since no node is available

OK, I seem to have made some progress here. I copied the kafka-clients jar (0.9.0) from the broker to the Logstash vendor lib directory, and also modified the Ruby file logstash-output-kafka_jars.rb to pick up the 0.9.0 version of the jar. This looks to be working.
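For anyone who wants to reproduce the workaround, the steps amount to something like the following. The paths below are placeholders I made up for illustration: the vendor directory layout differs between Logstash versions, and the jar location on the broker depends on how Kafka was installed, so locate both on your own hosts first.

```shell
# Assumed locations -- adjust to your install.
LS_HOME=/usr/share/logstash
KAFKA_LIBS=/opt/kafka/libs   # on the broker host

# 1. Find where the logstash-output-kafka plugin keeps its jars and
#    its jar-loading Ruby file.
find "$LS_HOME/vendor" -name 'kafka-clients-*.jar'
find "$LS_HOME/vendor" -name 'logstash-output-kafka_jars.rb'

# 2. Copy the broker-matched client jar from the Kafka broker host
#    into the plugin's jar directory found above.
scp kafka.mydomain.com:"$KAFKA_LIBS"/kafka-clients-0.9.0.*.jar \
    "$LS_HOME"/vendor/<plugin-jar-dir>/

# 3. Edit logstash-output-kafka_jars.rb so the kafka-clients version it
#    loads is the 0.9.0 jar instead of 1.1.0, then restart Logstash.
```

Note this only swaps which client jar the plugin loads; whether the 6.x plugin code actually works against the 0.9.0 client API beyond basic producing is something I have not verified exhaustively.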

But clearly backward compatibility with older brokers is broken out of the box!

@logstash @elastic team, could you please confirm whether this is a bug, so that I can open an issue on the Git repo?

