Aggregate multiple Filebeat instances

There are multiple Filebeat nodes that send data to Logstash. Logstash needs to aggregate the events together and send them to Elasticsearch. How should Logstash be configured so it does not become a single point of failure?

Logstash needs to aggregate the events together and send them to Elasticsearch.

Exactly what do you mean by this?

How should Logstash be configured so it does not become a single point of failure?

You can, for example, run multiple Logstash instances and update the Filebeat configuration to point to all of them. Filebeat will then pick one of them, and if that instance goes down it will switch to another.
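A minimal sketch of what that looks like on the Filebeat side, assuming two Logstash instances on placeholder hostnames `logstash1` and `logstash2` listening on the default Beats port 5044:

```yaml
# filebeat.yml (output section only) -- hostnames are hypothetical
output.logstash:
  hosts: ["logstash1:5044", "logstash2:5044"]
  # Optional: distribute events across all listed hosts instead of
  # sending to one host and failing over only when it is unreachable.
  loadbalance: true
```

With `loadbalance: true` Filebeat spreads events across all reachable hosts; without it, Filebeat sticks to one host and only moves to another when the connection fails.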
