ELK + redis query

Hi all,

I've set up ELK on AWS in a fault-tolerant configuration (multi-AZ), and
have been looking at integrating redis into the stack to ease the load on
logstash, as is commonly recommended. However, it seems to me that this
just introduces a single point of failure into an otherwise redundant
setup. While I gather that redis can be clustered, I have yet to find any
documentation or how-tos that focus on using clustered redis as part of a
fault-tolerant ELK setup.

I have my logstash instances load balanced and could theoretically scale
out that tier if those instances were to become overloaded. Would anyone
recommend this as a suitable alternative to having a single redis node?

Thanks,
Guy


It's an option, yes. But you'd have to figure out how to tell when an instance is overloaded.

Thanks for that info Mark. Following on from my question, are you aware of any examples of creating a high availability redis setup for ELK?

Actually, never mind that question. I did some further research and worked out the solution myself. Thanks again for your input.

Care to share so others can learn?

Sure. I realised I can just configure all the redis servers as individual inputs in my logstash config. This Google Groups post discusses it in more detail: https://groups.google.com/forum/#!topic/logstash-users/8Km9VFqapig.
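For anyone searching later, a minimal sketch of what that input section might look like (the hostnames and the `logstash` list key here are placeholders; adjust to your own setup):

```conf
input {
  # One redis input block per redis server. If one server is down,
  # logstash keeps consuming from the others.
  redis {
    host      => "redis-a.internal"   # placeholder hostname
    data_type => "list"
    key       => "logstash"
  }
  redis {
    host      => "redis-b.internal"   # placeholder hostname
    data_type => "list"
    key       => "logstash"
  }
}
```

Your shippers would then distribute events across the redis servers, and every logstash indexer reads from all of them.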

Another option is to put a redis instance on each logstash server, so that each logstash just reads from 127.0.0.1:6379. However, if a logstash server dies completely you also lose any events still queued in that server's redis, so this option would probably be my second choice.
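For completeness, the colocated variant is just a single local input per indexer (again a sketch; the `logstash` key is a placeholder):

```conf
input {
  # Read only from the redis instance running on this same host.
  # Simple, but events queued here are lost if this box dies.
  redis {
    host      => "127.0.0.1"
    port      => 6379
    data_type => "list"
    key       => "logstash"
  }
}
```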