Sending events from Kafka to Logstash

Hi,
Hope all is well. I have a question about using Kafka. On our end, we plan to receive data from Kafka into Logstash. In my Logstash configuration I have set up Kafka as an input, and Logstash is up and running, but it fails to create an index in Elasticsearch.

My question is whether I need Filebeat between Kafka and Logstash, or whether this can work without Filebeat.

Currently, I'm not using Filebeat on my servers. Any comments would be appreciated.

Thanks in advance.

In my Logstash configuration I have set up Kafka as an input, and Logstash is up and running, but it fails to create an index in Elasticsearch.

Temporarily replace your elasticsearch output with a stdout { codec => rubydebug } output to debug what part of your pipeline isn't working. Right now we don't know if it's the kafka input that can't consume messages or the elasticsearch output that doesn't send to ES.
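For example, a minimal debug pipeline could look like the sketch below; the broker address, topic name, and group id are placeholders you'd replace with your own values:

```
input {
  kafka {
    bootstrap_servers => "kafka-host:9092"  # placeholder: your Kafka broker
    topics            => ["my-topic"]       # placeholder: your topic
    group_id          => "logstash-debug"   # placeholder: any consumer group id
  }
}

output {
  # Print every consumed event to stdout so you can see whether
  # the kafka input is actually receiving messages.
  stdout { codec => rubydebug }
}
```

If events show up on stdout, the kafka input is working and the problem is on the Elasticsearch side.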

My question is whether I need Filebeat between Kafka and Logstash, or whether this can work without Filebeat.

Filebeat can't consume from Kafka, so using Filebeat as a middleman isn't even possible.


I'm running Logstash as a Docker service. How can I verify that it's connected to Kafka?

" stdout { codec => rubydebug }" its only way to find it or else we can find any other way

The point is to use an output that can't fail. If you want to use a file output that just writes all incoming events to a file, that'd be fine as well.
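For instance, a minimal file output might look like this; the path is an assumption, use any location the Logstash process can write to:

```
output {
  # Write every incoming event to a local file; if events appear here,
  # the input side of the pipeline is working.
  file {
    path => "/tmp/logstash-debug.log"  # placeholder: any writable path
  }
}
```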

I'd also look at Logstash's own logs. If they're quiet, turning up the log level could reveal interesting information.
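Since you're running Logstash in Docker, something like the following would show its logs and raise the log level; the container name logstash is an assumption:

```
# Assumption: your container is named "logstash"
docker logs -f logstash

# Start Logstash with a more verbose log level
bin/logstash --log.level=debug
# or set it persistently in logstash.yml:
# log.level: debug
```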

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.