Logstash input from multiple redis servers

Hello community! I have the following configuration: 9 nodes, 2 Redis servers, and 3 Logstash servers.

My logs go to a Redis server (acting as a broker) -> then to Logstash -> and on to Elasticsearch.

The question is: how can I set up the Logstash input so that if one of the Redis servers stops working, Logstash automatically takes data from the second Redis server?

I tried to collect data from both Redis servers with Logstash at the same time, but then the data gets duplicated in the index.

My Logstash config:

input {
    redis {
        host => "10.200.9.170"
        port => "6379"
        data_type => "list"    # consume entries pushed onto a Redis list
        key => "logs"
        codec => "plain"
        type => "logs"
    }
}

The Logstash redis input plugin does not support Redis in cluster mode; there is an old open issue about this from 7 years ago.

If you can change the broker from Redis to Kafka, you will be able to achieve what you want easily. If you cannot change the broker, I can think of two alternatives.
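
For reference, a minimal sketch of what the Kafka variant could look like; the broker addresses and the "logs" topic name are assumptions, not taken from your setup. Because all three Logstash servers share one consumer group, each partition is read by only one Logstash instance, so events are not duplicated, and Kafka's own replication handles broker failover:

input {
    kafka {
        bootstrap_servers => "10.200.9.180:9092,10.200.9.181:9092"    # hypothetical broker addresses
        topics => ["logs"]         # assumed topic name
        group_id => "logstash"     # shared group: the 3 Logstash servers split the work without duplicates
        codec => "plain"
    }
}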

One would be to manually change the pipeline when one of the Redis servers goes down, so that Logstash stops consuming from the failed Redis and starts consuming from the other one; this can be automated with a script (see the sketch below).
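
As an illustration, this is the standby pipeline such a script could swap in; the second server's address, 10.200.9.171, is hypothetical. With config.reload.automatic: true in logstash.yml, Logstash picks up the edited pipeline file without a restart:

input {
    redis {
        host => "10.200.9.171"    # hypothetical address of the second Redis server
        port => "6379"
        data_type => "list"
        key => "logs"
        codec => "plain"
        type => "logs"
    }
}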

The other option would be to put a load balancer like HAProxy in front of your Redis servers and point Logstash at the load balancer; in HAProxy you configure one Redis as a backup, so it only receives requests when the other Redis is down (see the sketch below).
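
A minimal HAProxy sketch of that idea, again assuming a second Redis at the hypothetical address 10.200.9.171; the host in your Logstash redis input would then point at the HAProxy address instead of a Redis server:

defaults
    mode tcp                # Redis speaks a raw TCP protocol, not HTTP
    timeout connect 5s
    timeout client  1m
    timeout server  1m

frontend redis_in
    bind *:6379
    default_backend redis_servers

backend redis_servers
    server redis1 10.200.9.170:6379 check
    server redis2 10.200.9.171:6379 check backup    # only used while redis1 is down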

Hi, thank you for your answer.

It's a pity that they decided not to support Redis cluster mode.
