I am sending our production logs to a Logstash cluster of 6 servers. In my Filebeat settings I have the number of workers set to 2, so each production server opens 12 pipelines (6 hosts x 2 workers) to the Logstash cluster.
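For context, this is roughly what my Filebeat output config looks like (hostnames are placeholders, and I am assuming `loadbalance: true`, which is what makes Filebeat open `worker` connections to each host):

```yaml
# filebeat.yml (sketch of the current setup described above)
output.logstash:
  # 6-node Logstash cluster; hostnames are illustrative
  hosts: ["logstash1:5044", "logstash2:5044", "logstash3:5044",
          "logstash4:5044", "logstash5:5044", "logstash6:5044"]
  # distribute events across all listed hosts
  loadbalance: true
  # 2 workers per host => 6 x 2 = 12 connections per prod server
  worker: 2
```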
But sometimes, due to production traffic spikes, our application logs get flooded and grow very large within a few minutes. I don't want to crash my Logstash cluster, as it is also used to parse logs from other applications.
I am continuously tailing my log file.
Is there a way to put a limit on how much data Filebeat sends at a time, so that it keeps shipping logs steadily (rather than flooding Logstash) in situations like this?