Hello community! I have the following configuration: 9 nodes, 2 Redis servers, and 3 Logstash servers.
My logs will go to a Redis server (acting as a broker) -> then to Logstash -> and finally to Elasticsearch.
The question is: how can I set up the Logstash input so that if one of the Redis servers stops working, Logstash will automatically take data from the second Redis server?
I tried collecting data from both Redis servers with Logstash at the same time, but then the data gets duplicated in the index.
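The input configuration I tried looks roughly like this (hostnames and the list key here are placeholders), with both inputs reading at the same time:

```
input {
  redis {
    host => "redis1.example.com"
    data_type => "list"
    key => "logstash"
  }
  redis {
    host => "redis2.example.com"
    data_type => "list"
    key => "logstash"
  }
}
```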
The Logstash redis input plugin does not support Redis in cluster mode; there is an old open issue about this from 7 years ago.
If you can change the broker from Redis to Kafka, you will be able to achieve what you want easily (see the example below). If you cannot change the broker, I can think of two alternatives.
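With Kafka, failover is handled by the client itself: you list all brokers and the consumer reconnects on its own, and a consumer group prevents duplicate consumption when several Logstash instances read the same topic. A minimal sketch, assuming a topic named `logs` and two brokers (all names are placeholders):

```
input {
  kafka {
    # The Kafka client fails over between brokers on its own.
    bootstrap_servers => "kafka1.example.com:9092,kafka2.example.com:9092"
    topics => ["logs"]
    # Logstash instances sharing a group_id split the work
    # instead of each receiving a copy of every event.
    group_id => "logstash"
  }
}
```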
One would be to manually change the pipeline when one of the Redis servers goes down, so that Logstash stops consuming from one Redis and starts consuming from the other; this can be automated with a script (see the sketch below).
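A minimal sketch of such a script, assuming Logstash runs with `config.reload.automatic: true` so that rewriting the pipeline file is enough to switch inputs. Hostnames, the pipeline path, and the list key are hypothetical:

```python
#!/usr/bin/env python3
# Hypothetical failover watcher: keeps the Logstash pipeline pointed
# at a reachable Redis host. Requires the redis-py client (pip install redis).
import time
import redis

PRIMARY = "redis1.example.com"      # placeholder hostnames
SECONDARY = "redis2.example.com"
PIPELINE_PATH = "/etc/logstash/conf.d/redis-input.conf"

PIPELINE_TEMPLATE = """\
input {{
  redis {{
    host => "{host}"
    data_type => "list"
    key => "logstash"
  }}
}}
"""

def is_up(host):
    # PING the server with a short timeout; treat any connection
    # failure as "down".
    try:
        return redis.Redis(host=host, port=6379, socket_timeout=2).ping()
    except redis.exceptions.ConnectionError:
        return False

current = None
while True:
    # Prefer the primary; fall back to the secondary only while
    # the primary is unreachable.
    target = PRIMARY if is_up(PRIMARY) else SECONDARY
    if target != current:
        with open(PIPELINE_PATH, "w") as f:
            f.write(PIPELINE_TEMPLATE.format(host=target))
        current = target
    time.sleep(10)
```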
The other option would be to use a load balancer like HAProxy in front of your Redis servers and point Logstash at the load balancer. In the load balancer you configure one Redis as a backup, so it only receives requests if the other Redis goes down (sketch below).
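A hedged sketch of that HAProxy setup (hostnames are placeholders): Logstash points its redis input at the proxy address, the health check speaks the Redis protocol (`PING`/`+PONG`), and `redis2` is marked `backup` so it only receives traffic when `redis1` fails its check:

```
defaults
    mode tcp
    timeout connect 5s
    timeout client  1h
    timeout server  1h

listen redis
    bind *:6379
    # Health-check by speaking the Redis protocol.
    option tcp-check
    tcp-check send PING\r\n
    tcp-check expect string +PONG
    server redis1 redis1.example.com:6379 check
    # "backup" means redis2 only gets traffic when redis1 is down.
    server redis2 redis2.example.com:6379 check backup
```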