Multiple Filebeat instances with separate configuration files sending to different Logstash containers

Is it possible to run multiple instances of Filebeat with different filebeat.yml config files?

I need to send log lines to different servers running Logstash.

Thx, Keith :^)

Hi @kmiklas, can you confirm the scenario here: do you want to send the same data to multiple Logstash servers simultaneously, or are you talking about a load-balancing scenario? If the latter, have a look at https://www.elastic.co/guide/en/beats/filebeat/current/load-balancing.html.
Otherwise, sending the same events to multiple outputs of the same type from a single Filebeat instance is not supported; more on this here: Support multiple outputs of the same type (like two independent Logstash clusters) · Issue #1035 · elastic/beats · GitHub.
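
For the load-balancing case, the Logstash output can list several hosts and enable `loadbalance` so events are spread across them. A minimal sketch, with hypothetical host names, assuming both hosts belong to the same Logstash tier:

```yaml
output.logstash:
  # Hypothetical Logstash hosts on the default Beats port.
  hosts: ["logstash-a.example.com:5044", "logstash-b.example.com:5044"]
  # Spread batches across all listed hosts instead of sticking to one.
  loadbalance: true
```

Note that this balances traffic across one Logstash tier; it does not deliver a full copy of the data to each host, which is why the independent-clusters case needs one of the alternatives below.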

Alternatives for this scenario were previously suggested in another Discuss thread:

  1. Run multiple Filebeat instances, each with its own config file and registry/data path, so each instance keeps separate state for the cluster it sends traffic to (see the first sketch after this list).
  2. Have Filebeat push to Kafka and use two Logstash instances/clusters (one per required output) with different consumer groups; the consumer groups decouple the two systems (second sketch below).
  3. Use Redis publish/subscribe (datatype: channel) to push events (third sketch below). Drawback: if a Logstash instance is down, one ES cluster might not get any data (potential data loss).
  4. Implement some kind of replication: still couple the systems, but have Filebeat send to Logstash and have Logstash push to the local ES plus a queue (file, S3, Redis, Kafka, whatever). A second Logstash then forwards the queued logs to the second ES instance.
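
For option 1, here is a sketch of what two side-by-side instances could look like. File names, paths, and hosts are made up; the key point is that each instance gets its own config and its own `path.data`, so the registries don't collide:

```yaml
# filebeat-a.yml (hypothetical name) -- ships to the first Logstash
filebeat.inputs:
  - type: log               # input type may differ by Filebeat version
    paths:
      - /var/log/app/*.log

# Give every instance its own data path so each keeps its own registry/state.
path.data: /var/lib/filebeat-a

output.logstash:
  hosts: ["logstash-a.example.com:5044"]
```

A second file, say filebeat-b.yml, would be identical apart from `path.data: /var/lib/filebeat-b` and `hosts: ["logstash-b.example.com:5044"]`; start each instance against its own config, e.g. `filebeat -c filebeat-a.yml` and `filebeat -c filebeat-b.yml`.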
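
For option 2, Filebeat itself only needs a Kafka output; the fan-out to two Logstash clusters happens on the consumer side. A rough sketch with made-up broker and topic names:

```yaml
output.kafka:
  # Hypothetical brokers and topic.
  hosts: ["kafka-1.example.com:9092", "kafka-2.example.com:9092"]
  topic: "filebeat-logs"
  compression: gzip
```

Each Logstash cluster then reads the topic through a Kafka input with its own consumer group id, so both groups receive a full, independent copy of the stream and one being down does not affect the other.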
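
For option 3, the Redis output can publish to a channel instead of pushing onto the default list. Again a sketch with placeholder values:

```yaml
output.redis:
  hosts: ["redis.example.com:6379"]
  key: "filebeat"       # channel the Logstash instances subscribe to
  datatype: channel     # publish/subscribe instead of the default list
```

Because pub/sub delivery is fire-and-forget, any subscriber that is down while events are published simply misses them, which is the data-loss caveat noted in option 3.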
