Is it possible to run multiple Logstash nodes with the same pipeline to provide high availability? That is, if one Logstash node goes down, Beats would push data to another node so ingestion keeps working.
I looked at this page https://www.elastic.co/guide/en/logstash/current/deploying-and-scaling.html, but I don't understand the "Scalability" section very clearly. It says:
It’s common to deploy just one Beats input per Logstash node, but multiple Beats inputs can also be deployed per Logstash node to expose independent endpoints for different data sources.
So it sounds like each Beat writes to one specific Logstash node. When that node is down, the data from that Beat would be blocked.
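For reference, here is a minimal sketch of the Filebeat output configuration I'm imagining (the hostnames are placeholders). I see that `output.logstash` accepts multiple `hosts` plus a `loadbalance` option, but I'm not sure whether this actually gives failover when one node dies:

```yaml
# filebeat.yml (hypothetical hostnames)
output.logstash:
  # Two Logstash nodes running the same pipeline
  hosts: ["logstash-a:5044", "logstash-b:5044"]
  # With loadbalance: true, events are distributed across all listed
  # hosts; with false (the default), one host is picked at random and
  # the others are only tried if the connection fails.
  loadbalance: true
```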
Any help or thoughts would be appreciated. Thanks!