Kafka logstash plugin on a clustered zookeeper setup

Hi,

I have a clustered setup for ZooKeeper and Kafka. I want to parse the logs from multiple Kafka topics with Logstash.

  1. What is the best way to do so?
  2. Will kafka-logstash plugin be able to solve my problem?
  3. Do we have to install Logstash on the server where ZooKeeper is running? I want Logstash to be installed on a separate instance.
  1. That's a very broad question.
  2. Yes, Logstash with the kafka input plugin can pull messages from Kafka.
  3. You can run Logstash on any machine that can connect to Kafka.
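
As a starting point, a minimal Logstash pipeline for this could look like the sketch below. The broker, topic, and Elasticsearch host names are placeholders you would replace with your own; note that newer versions of the kafka input connect to the brokers directly, while older versions connected through ZooKeeper with a zk_connect option instead.

```conf
input {
  kafka {
    # Hypothetical broker addresses -- replace with your cluster's.
    # Older plugin versions used zk_connect => "zk1:2181,zk2:2181" instead.
    bootstrap_servers => "kafka1:9092,kafka2:9092"
    # A single kafka input can subscribe to multiple topics.
    topics => ["topic-a", "topic-b"]
  }
}

output {
  elasticsearch {
    # Hypothetical Elasticsearch host -- replace with your own.
    hosts => ["es1:9200"]
  }
}
```

Since this only needs network access to the Kafka brokers, it can run on any separate instance.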

The kafka input plugin will read messages from the Kafka topic, right? I don't want to use Kafka as a messaging queue for Logstash; Kafka is used for message queuing by another component. I just want to ship the logs of each topic to the ELK stack.

The kafka input plugin will read messages from the Kafka topic, right?

Yes.

I don't want to use Kafka as a messaging queue for Logstash; Kafka is used for message queuing by another component. I just want to ship the logs of each topic to the ELK stack.

Sure. Logstash doesn't know whether the topic is Logstash-specific or whether some other piece of software publishes messages to it.

Great, but is there a way to get the logs of topics in Kafka using this plugin or any other plugin?

To me "logs" in a Kafka context means the message store where messages published to a topic end up, and from which consumers (like Logstash) can consume messages. Are you talking about something else?