How to replicate packets to multiple Redis hosts?

The latest Filebeat can either load-balance across multiple Redis hosts or send to just one of them (with failover) when multiple hosts are configured.

https://www.elastic.co/guide/en/beats/filebeat/5.x/redis-output.html#_loadbalance

However, is there any way to replicate events to all of the Redis hosts?
I tried the configuration below, but events were only sent to the second host.

output.redis:
  hosts: ["172.30.1.192"]
  password: "redis"
  key: "filebeat"
  db: 0
  timeout: 5
  loadbalance: false

output.redis:
  hosts: ["172.30.1.110"]
  password: "redis"
  key: "filebeat"
  db: 0
  timeout: 5
  loadbalance: false

In the YAML parser, the settings from the second output.redis section overwrite the first one. The first section as written is never presented to Filebeat at load time (config processing would warn and quit here).
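For reference, the valid way to list multiple Redis hosts is a single output.redis section. Note that this gives you failover or load balancing, never replication. A sketch using the hosts from your config:

output.redis:
  hosts: ["172.30.1.192", "172.30.1.110"]   # both hosts in one section
  password: "redis"
  key: "filebeat"
  db: 0
  timeout: 5
  loadbalance: false   # false = send to one host, failing over if it becomes unresponsive
                       # true  = distribute events across all listed hosts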

What exactly do you want replication for?

Replication is deliberately not supported, as users normally ask for replication in order to send events to multiple environments, e.g. production and development. See this GitHub discussion on why this might be a bad idea.

@steffens

My goal was to duplicate the events from Filebeat, store them in multiple Redis instances, and
eventually store the events in different Elasticsearch clusters.

But I think the explanation below clarifies the concept of shipping events from Beats.

The problem is that with multiple Logstash outputs in Beats (essentially doing event routing), the Logstash instances get implicitly coupled via Beats: if one instance is down or unresponsive, the others won't get any data. A message queue like Kafka helps decouple these systems, as long as Kafka itself is operating.
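In config terms, that decoupling means pointing Filebeat at Kafka instead of Redis; each downstream pipeline (e.g. one Logstash per Elasticsearch cluster) then reads the topic under its own consumer group and receives a full copy of the events. A minimal sketch, where the broker addresses and topic name are placeholders:

output.kafka:
  hosts: ["kafka1:9092", "kafka2:9092"]   # placeholder broker addresses
  topic: "filebeat"                       # placeholder topic; each consumer group gets the full stream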
