Hi,
I have a clustered setup for ZooKeeper and Kafka. I want to feed the logs from multiple Kafka topics into Logstash.
- What is the best way to do so?
- Will the Kafka input plugin for Logstash solve my problem?
- Do we have to install Logstash on the server where ZooKeeper is running? I want Logstash to be installed on a separate instance.
The Kafka input plugin will read messages from the Kafka topic, right? I don't want to use Kafka as a messaging queue for Logstash; Kafka is used for message queuing by another component. I just want to parse the logs of each topic into the ELK stack.
The Kafka input plugin will read messages from the Kafka topic, right?
Yes.
I don't want to use Kafka as a messaging queue for Logstash; Kafka is used for message queuing by another component. I just want to parse the logs of each topic into the ELK stack.
Sure. Logstash doesn't know if the topic is Logstash-specific or if there's some other piece of software that publishes messages to the topic.
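If it helps, here's a minimal sketch of the kind of pipeline this would be. It assumes the newer Kafka input plugin that takes `bootstrap_servers` (older plugin versions connected through ZooKeeper via `zk_connect` instead); the broker addresses, topic names, consumer group id, and Elasticsearch host are all placeholders you'd swap for your own:

```
input {
  kafka {
    # Placeholder broker list and topic names -- adjust for your cluster.
    bootstrap_servers => "kafka1:9092,kafka2:9092,kafka3:9092"
    topics            => ["topic-a", "topic-b"]
    # A dedicated consumer group, so the other components reading these
    # topics keep their own offsets and are unaffected.
    group_id          => "logstash-elk"
  }
}

filter {
  # Parse the message payload here (grok, json, etc.) depending on its format.
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]   # placeholder ES address
  }
}
```

Logstash only needs network access to the Kafka brokers, so it can run on a separate instance; it doesn't have to sit on the ZooKeeper or broker machines.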
Great, but is there a way to get the logs of the topics in Kafka using this plugin or any other plugin?
To me "logs" in a Kafka context means the message store where messages published to a topic end up, and from which consumers (like Logstash) can consume messages. Are you talking about something else?