Redis vs Kafka vs RabbitMQ in ELK


I am going to use ELK for my log analysis/monitoring, and I need a broker between the Logstash shipper and the Logstash indexer. Which tool should I use for this: Redis, Kafka, or RabbitMQ?

Please help me with this.

I've never used Redis or Kafka, but I am using RabbitMQ and I like it so far. My cluster is small, though, so I haven't seen how it performs with a large amount of data. It seems to work well with Logstash from what I've seen. I've never had problems with it except at the beginning, simply because of the learning curve.
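For what it's worth, this is roughly how I wire it up (hostnames, exchange, and queue names here are just example values, not a recommendation). The shipper publishes to a RabbitMQ exchange and the indexer consumes from a queue bound to it:

```conf
# Shipper side: publish events to a RabbitMQ exchange
output {
  rabbitmq {
    host          => "broker.example.com"   # example hostname
    exchange      => "logstash"
    exchange_type => "direct"
    key           => "logstash"
    durable       => true
  }
}

# Indexer side: consume events from the bound queue
input {
  rabbitmq {
    host    => "broker.example.com"
    queue   => "logstash"
    exchange => "logstash"
    key     => "logstash"
    durable => true
  }
}
```

Marking the exchange and queue durable means messages survive a broker restart, which matters if you're using it as a buffer.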

Thanks. May I know how much data per day you are pushing through it?

At least 15 GB a day.

If you edit your post and move it to the Logstash category you might get the attention of more people. The current category is for the deprecated logstash-forwarder tool, but that's not what your question is about.

Either Redis, RabbitMQ, or Kafka will work for what you want to do. What's best depends on your needs.


Can you please give some scenarios where I would use Redis, Kafka, or RabbitMQ?

This is confusing. You say you want to use a broker and want advice on which one to pick, and then you follow up asking for scenarios where a broker can be used at all.

A broker acts as a queue from which one or more Logstash instances can pull events. It's therefore useful for load balancing, but it also acts as a buffer when there are surges of incoming events. That's especially relevant when events arrive from sources that don't buffer locally themselves, i.e. if you can't receive their events they will be dropped on the floor.
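To make that concrete, here's a minimal sketch of a shipper/indexer pair using Redis as the broker (the hostname and list key are placeholder values; adjust them to your environment). The shipper pushes events onto a Redis list and the indexer pops them off, so the list grows during surges instead of events being lost:

```conf
# Shipper side: push events onto a Redis list acting as the queue
output {
  redis {
    host      => "redis.example.com"   # example hostname
    data_type => "list"
    key       => "logstash"
  }
}

# Indexer side: pop events off the same list
input {
  redis {
    host      => "redis.example.com"
    data_type => "list"
    key       => "logstash"
  }
}
```

You can point several indexer instances at the same list to get the load-balancing effect, since each event is popped by exactly one consumer.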