Using Kafka as a message queue for Logstash

Hey all,

I am currently setting up an environment with Elasticsearch and Logstash 7.3.1 to serve as a baseline SIEM environment. We have Logstash running on the app server as a pivot point and receiver for our Beats and syslog data, and we would like that data to then be indexed on our storage device. Ideally, I would like to use Kafka as a message queue so that if something happens on the storage end, we won't lose our logs.

Does anyone know whether I need a Kafka broker and a Zookeeper instance also running on our app server to use the Kafka output plugin, or will the output and input plugins work standalone without a Zookeeper instance running?


Not sure I follow the question, but if you are using a Kafka output then you obviously need a Kafka instance, and that requires a Zookeeper instance. Neither has to be local to the Logstash instance or dedicated to it.
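For reference, the output plugin just points at a broker; it does not talk to Zookeeper itself. A minimal sketch of the producing side might look like this (the broker address and topic name are placeholders, not values from your environment):

```
output {
  kafka {
    bootstrap_servers => "kafka-host:9092"   # Kafka broker to connect to (placeholder)
    topic_id => "logstash-logs"              # topic to publish events to (placeholder)
    codec => json                            # serialize events as JSON
  }
}
```

The broker named in `bootstrap_servers` can live on any host Logstash can reach.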

Yeah, I was a bit jumbled there.

I had found some conflicting info about whether Kafka and Zookeeper needed dedicated instances to serve as an output for Logstash, or whether they could ride on another instance. For the sake of ease and to avoid getting wires crossed, I think I will set up an instance of Zookeeper on the app server instead of using another one of our instances to facilitate the log flow.
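If it helps, the storage-side pipeline would then just consume from the same topic and index into Elasticsearch. A sketch, again with host names and the topic as placeholder values rather than anything from your setup:

```
input {
  kafka {
    bootstrap_servers => "kafka-host:9092"   # same broker the app-server pipeline writes to (placeholder)
    topics => ["logstash-logs"]              # topic produced by the app-server pipeline (placeholder)
    codec => json                            # events were serialized as JSON by the producer
  }
}
output {
  elasticsearch {
    hosts => ["storage-host:9200"]           # Elasticsearch on the storage device (placeholder)
  }
}
```

With this arrangement, Kafka buffers the events, so if the storage side goes down the logs stay on the topic until the consumer comes back.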


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.