Log storm after the Logstash server becomes unavailable for some time


I have a small setup with 20 clients (logstash-forwarder) and one server running Logstash, Elasticsearch, and Kibana. Recently the destination server for my logstash-forwarders became unavailable for some time (two days). After the server came back up, all the logstash-forwarders tried to send their buffered logs at once. The Logstash server was overloaded and no logs could be written.
As a workaround, I made the Logstash server listen on a new port and then reconfigured each logstash-forwarder machine by hand to use the new port.

Is there another way to handle this problem?

Have a read through https://www.elastic.co/guide/en/logstash/current/deploying-and-scaling.html, but you probably need to look into a broker to act as a buffer.
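To illustrate the broker idea, here is a minimal sketch of a Logstash indexer pipeline that pulls events from a Redis list instead of receiving them directly from the forwarders; the hostname `redis-broker` and the list key `logstash` are placeholders you would replace with your own values:

```
# Indexer pipeline: drain events from the Redis list at a rate
# the server can sustain, instead of taking the full burst directly.
input {
  redis {
    host      => "redis-broker"   # hypothetical broker hostname
    data_type => "list"
    key       => "logstash"       # list the shippers push into
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```

With this layout, a two-day backlog just accumulates in Redis, and the indexer works through it at its own pace rather than being flooded the moment the server comes back.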

Thank you!

Is Redis the state of the art as a broker?

I'd say Kafka is a little more advanced than Redis these days.
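For comparison, a Kafka-based indexer input would look roughly like the sketch below, assuming a broker reachable at `kafka-broker:9092` and a topic named `logs` (both placeholders); Kafka's consumer groups also let you run several Logstash indexers against the same topic to share the load:

```
# Indexer pipeline reading from Kafka; the consumer group lets
# multiple Logstash instances split the partitions between them.
input {
  kafka {
    bootstrap_servers => "kafka-broker:9092"   # hypothetical broker address
    topics            => ["logs"]              # topic the shippers write to
    group_id          => "logstash-indexers"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```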