Logstash not reading messages from Kafka

I am testing a simple pipeline - Filebeat > Kafka > Logstash > File.

Logstash is not reading from Kafka, but I can see that Kafka has the messages when I use this command -

bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic MyTopic --from-beginning
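To double-check whether Logstash's consumer is picking anything up at all, I believe the consumer group can be inspected as well (the Logstash kafka input defaults to the group id "logstash" unless group_id is overridden, and as far as I know Kafka 0.10.2 wants --new-consumer together with --bootstrap-server) -

bin/kafka-consumer-groups.sh --new-consumer --bootstrap-server localhost:9092 --describe --group logstash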

My Filebeat configuration -

filebeat.prospectors:
- input_type: log
  paths:
    - /root/LogData/input.log

output.kafka:
  hosts: ["10.290.18.14:9092"]
  topic: MyTopic
  partition.round_robin:
    reachable_only: false
  required_acks: 1
  compression: none
  max_message_bytes: 1000000
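
To confirm Filebeat is actually publishing to Kafka, I understand Filebeat 5.x can be run in the foreground with debug logging enabled (run from the directory holding filebeat.yml; "*" just enables all debug selectors) -

filebeat -e -c filebeat.yml -d "*"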

My Logstash configuration -

input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["MyTopic"]
  }
}

output {
  file {
    path => "/usr/share/logstash/test_out.log"
  }
}
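
As a sanity check, I believe the same kafka input can also be tested standalone with an inline pipeline that prints to stdout (the rubydebug codec is only there to make the output readable) -

bin/logstash -e 'input { kafka { bootstrap_servers => "localhost:9092" topics => ["MyTopic"] } } output { stdout { codec => rubydebug } }'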

My Kafka and Logstash are running on the same VM. I am using the Logstash Docker image, and I started the Logstash container as follows -

sudo docker run -d --name logstash --expose 25826 -p 25826:25826 docker.elastic.co/logstash/logstash:5.4.0 --debug
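
A variant I could try - mounting my pipeline config into the container and using host networking, so that "localhost:9092" inside the container actually reaches the Kafka broker on the VM (assuming the image reads pipeline files from /usr/share/logstash/pipeline/; /root/logstash/pipeline/ is just an example path for where my config lives on the host) -

sudo docker run -d --name logstash --network host -v /root/logstash/pipeline/:/usr/share/logstash/pipeline/ docker.elastic.co/logstash/logstash:5.4.0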

Filebeat is running on a different VM.
I am creating the topic as follows -

bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic MyTopic
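
To verify that the topic looks healthy, it can also be described -

bin/kafka-topics.sh --describe --zookeeper localhost:2181 --topic MyTopic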

I am using -
kafka_2.11-0.10.2.0
Logstash 5.4
Filebeat 5.4
(I also tried running Logstash without Docker, directly on Ubuntu, but it still does not work.)
