ELK + redis query


(Guy) #1

Hi all,

I've set up ELK on AWS in a fault-tolerant configuration (multi-AZ), and
have been looking at integrating redis into the stack to ease the load on
logstash, as is commonly recommended. However, it seems to me that this
just introduces a single point of failure into an otherwise redundant
setup. While I gather that redis can be clustered, I have yet to find any
documentation or how-tos that focus on using clustered redis as part of a
fault-tolerant ELK setup.

I have my logstash instances load balanced and could theoretically scale
out that tier if those instances were to become overloaded. Would anyone
recommend this as a suitable alternative to having a single redis node?

Thanks,
Guy


(Mark Walkom) #2

It's an option, yes, but you'd need a way to tell when an instance is overloaded.


(Guy) #3

Thanks for that info, Mark. Following on from my question: are you aware of any examples of building a highly available redis setup for ELK?


(Guy) #4

Actually, never mind that question. I did some further research and worked out the solution myself. Thanks again for your input.


(Mark Walkom) #5

Care to share so others can learn?


(Guy) #6

Sure. I realised I can simply configure all the redis servers as individual inputs in my logstash config. This Google Groups post discusses it in more detail: https://groups.google.com/forum/#!topic/logstash-users/8Km9VFqapig.
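For anyone finding this later, here's a minimal sketch of what that looks like. The hostnames and the `key` name are placeholders, not from the thread; this assumes the common list-based pattern where shippers LPUSH events onto a redis list and each logstash indexer reads from every redis node:

```conf
# Logstash pipeline with one redis input block per redis node.
# If one node goes down, logstash keeps consuming from the others.
input {
  redis {
    host      => "redis-a.internal"   # placeholder hostname
    port      => 6379
    data_type => "list"
    key       => "logstash"           # example list key
  }
  redis {
    host      => "redis-b.internal"   # placeholder hostname
    port      => 6379
    data_type => "list"
    key       => "logstash"
  }
}
```

The shippers would be configured to write to any available redis node, so losing a single node only costs the events still queued on it, not the whole pipeline.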

Another option is to put a redis instance on each logstash server, so that each logstash instance just points at 127.0.0.1:6379. However, if a logstash server dies completely, you'll also lose the events still queued in redis on that server, so this option would probably be my second choice.
