Beats with multiple outputs while transitioning to a new Elastic cluster

We are running a PoC with Filebeat + Metricbeat agents on a number of endpoints, sending data through ingest nodes that run various pipelines on the Filebeat events.

While preparing the move to a new Elastic cluster, we would like to duplicate events so they still hit our PoC cluster and also reach the newly prepared production cluster, i.e. have events sent to both clusters. Would this be possible somehow?

The coming production cluster also has a Logstash setup, which we could pass data through (without any processing) if Beats can be configured to send data to two outputs, a Logstash output as well as an Elasticsearch output.

TIA

Unfortunately no, Beats only supports one output.

If you want to send the same data from Beats to different clusters, you will need to send it to Logstash and have Logstash send the data to both clusters.
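
A rough sketch of that setup (hostnames, ports, and credentials below are placeholders, adjust to your environment): point Beats' single output at Logstash, and have the Logstash pipeline fan out to both clusters.

```yaml
# filebeat.yml - Beats' single output points at Logstash (host is a placeholder)
output.logstash:
  hosts: ["logstash.example.com:5044"]
```

```
# pipeline.conf - receive from Beats and write every event to both clusters
input {
  beats {
    port => 5044
  }
}

output {
  # existing PoC cluster
  elasticsearch {
    hosts => ["https://poc-cluster.example.com:9200"]
    # keep your existing index settings; if you rely on ingest pipelines,
    # set the pipeline option here so events are still processed
  }
  # new production cluster
  elasticsearch {
    hosts => ["https://prod-cluster.example.com:9200"]
  }
}
```

With no filters in the pipeline, Logstash acts as a pass-through and every event received from Beats is written to both elasticsearch outputs.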

Or send it to Kafka and use Logstash to consume it and send the data to the different clusters.
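
A similar sketch for the Kafka route (broker address and topic name are placeholders): Beats writes to a Kafka topic, and Logstash consumes that topic and fans out to the two clusters with the same output block as above.

```yaml
# filebeat.yml - single Kafka output (broker and topic are placeholders)
output.kafka:
  hosts: ["kafka1.example.com:9092"]
  topic: "beats-events"
```

```
# pipeline.conf - consume the topic and reuse the dual elasticsearch output block
input {
  kafka {
    bootstrap_servers => "kafka1.example.com:9092"
    topics => ["beats-events"]
    codec => "json"   # Beats writes JSON to Kafka
  }
}
```

The Kafka route also gives you a buffer between the agents and the clusters, which helps if one of the clusters is temporarily unavailable during the migration.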

Right, thanks, I will try to proxy through Logstash. I believe I did this in the past when I used two different ES clusters for different data through the same Logstash instances.

This is also covered in this older thread, or try reading this article, though it's more about splitting than duplicating.
