We're using Redis to pass data between Logstash instances. I currently have two Logstash instances with the same config:
input {
  redis {
    data_type => "channel"
    host => "${REDIS_HOST}"
    key => "logs"
    id => "logs"
  }
}
This results in duplicate data in Elasticsearch, so I'm guessing each Logstash node reads the same events and writes them to Elasticsearch independently.
Is it possible to use the Redis input in a way where this doesn't occur? I assumed Logstash would have some mechanism to prevent duplication, but I can't find anything in the documentation about how to solve this.
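For comparison, I wondered whether switching the data type from "channel" (pub/sub, where every subscriber receives every message) to "list" would make Redis behave like a work queue, so each event is popped by only one instance. A rough sketch of what I had in mind, though I'm not sure this is the intended approach:

input {
  redis {
    # Assumption: with data_type "list", Logstash pops events off a Redis
    # list, so each event should be consumed by only one instance.
    data_type => "list"
    host => "${REDIS_HOST}"
    key => "logs"
    id => "logs"
  }
}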