Hi @suppandi, and welcome!
In general it is not possible to ship logs from a single Filebeat instance to multiple outputs, but you have some options:
- Send all logs to Logstash, and do the output selection there using conditionals (see the first sketch after this list). When using multiple outputs with Logstash, take into account that if one of them becomes unavailable, processing of all events is blocked, even for the other outputs.
- Send all logs to Kafka, and have two Logstash instances reading and filtering, one for each Elasticsearch cluster (see the second sketch below). This doesn't have the disadvantage of the previous option, but it requires more infrastructure (a Kafka cluster, one Logstash instance per cluster...).
- Deploy multiple Filebeat instances on your nodes, one for each output (see the third sketch below), all of them with the same configuration except for:
- A different registry file, so they can independently keep track of processed logs
- A different output configuration
  - A processor to drop the events each output is not interested in
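
For the first option, here is a minimal Logstash pipeline sketch. The `[fields][target]` field and the cluster hosts are assumptions; use whatever field you already set on the events in Filebeat to tell them apart:

```
input {
  beats {
    port => 5044
  }
}

output {
  # Route events on a custom field added in Filebeat (hypothetical name).
  # Remember: if either cluster goes down, the whole pipeline blocks.
  if [fields][target] == "cluster_a" {
    elasticsearch {
      hosts => ["https://cluster-a.example.com:9200"]
    }
  } else {
    elasticsearch {
      hosts => ["https://cluster-b.example.com:9200"]
    }
  }
}
```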
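
For the second option, Filebeat publishes everything to one Kafka topic and each Logstash instance consumes that topic independently. The broker addresses and topic name below are placeholders; the important detail is giving each Logstash a different `group_id`, so each one receives its own full copy of the stream:

```yaml
# filebeat.yml
output.kafka:
  hosts: ["kafka1:9092", "kafka2:9092"]
  topic: "filebeat-logs"
```

```
# Logstash pipeline for cluster A; the second instance would be identical
# except for its group_id and its elasticsearch hosts.
input {
  kafka {
    bootstrap_servers => "kafka1:9092"
    topics => ["filebeat-logs"]
    group_id => "logstash-cluster-a"
  }
}

output {
  elasticsearch {
    hosts => ["https://cluster-a.example.com:9200"]
  }
}
```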
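
For the third option, a sketch of one of the Filebeat configurations (the one for cluster A). The paths, registry location, and the `fields.target` value it filters on are assumptions; you could set such a field per input with the `fields` option, or filter on anything else that distinguishes your events:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/*.log

# Each instance needs its own registry so they keep track of processed
# logs independently (on Filebeat < 7.0 this setting is
# filebeat.registry_file instead).
filebeat.registry.path: /var/lib/filebeat-cluster-a/registry

processors:
  # Drop the events this output is not interested in; the field and
  # value are hypothetical, adapt them to how your events are tagged.
  - drop_event:
      when:
        not:
          equals:
            fields.target: "cluster_a"

output.elasticsearch:
  hosts: ["https://cluster-a.example.com:9200"]
```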
I hope this helps you think through a solution for your case.