How to limit filebeat logs sent to logstash

Hi,
I am sending our production logs to a Logstash cluster that has 6 servers. In my Filebeat settings I have the number of workers set to 2, so that is 12 connections to the Logstash cluster from each prod server.

But sometimes, due to production traffic spikes, our application logs get flooded and grow very large within a few minutes. I don't want to crash my Logstash cluster, as it is used to parse other applications' logs too.

I am continuously tailing my log file.

Is there a way to put a limit on how many logs are sent at a time, while Filebeat keeps sending data in this kind of situation?

Thanks,
Nikhil

You cannot limit event rates from Filebeat directly. You are advised to use OS/network tooling to put whatever policies you require in place. See: https://www.elastic.co/guide/en/beats/filebeat/current/faq.html#bandwidth-throttling
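For example, on a Linux Filebeat host you can shape outgoing traffic with `tc`. This is only a minimal sketch: it assumes the interface is `eth0` and that Logstash listens on the default beats port 5044, so adjust both (and the rate) to your environment.

```sh
DEV=eth0

# Root HTB qdisc; traffic that does not match the filter below stays unshaped.
tc qdisc add dev "$DEV" root handle 1: htb

# Class 1:10 caps matched traffic at roughly 5 Mbit/s with a small burst allowance.
tc class add dev "$DEV" parent 1: classid 1:10 htb rate 5mbit burst 32k

# Classify only packets destined for TCP port 5044 (Logstash) into class 1:10.
tc filter add dev "$DEV" parent 1: protocol ip prio 1 u32 \
    match ip dport 5044 0xffff flowid 1:10
```

Because Filebeat retries and keeps its registry, events that are slowed down this way are not lost; they are simply delivered later.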

Since Logstash is the server and you want to protect it from being overloaded by logs, you should consider applying QoS rules on the Logstash server. With rate limiting in place on the network or on the server itself, back-pressure builds up in Filebeat and slows it down. You might even consider a time-scheduled policy, reducing the bandwidth available to Filebeat even more at peak times.
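One way to do that on a Linux Logstash server is ingress policing with `tc`. Again only a sketch under assumptions: interface `eth0`, the default beats port 5044, and a hypothetical `beats-limit.sh` wrapper for the scheduled part; tune the rates to what your cluster can actually absorb.

```sh
DEV=eth0

# Ingress qdisc plus a policer on traffic arriving at TCP port 5044.
# Dropping excess packets makes TCP back off, which Filebeat experiences
# as back-pressure and slows down accordingly.
tc qdisc add dev "$DEV" handle ffff: ingress
tc filter add dev "$DEV" parent ffff: protocol ip prio 1 u32 \
    match ip dport 5044 0xffff \
    police rate 10mbit burst 64k drop flowid :1

# A time-scheduled policy can be as simple as cron jobs that re-apply the
# policer with a different rate. beats-limit.sh is a hypothetical wrapper
# that deletes and re-adds the filter above with the rate it is given:
#   0 8  * * * /usr/local/sbin/beats-limit.sh 5mbit    # tighten during peak hours
#   0 20 * * * /usr/local/sbin/beats-limit.sh 20mbit   # relax off-peak
```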

Furthermore, if you add more Filebeat instances to your environment, the overall bandwidth used will not change (you don't have to re-balance all Filebeat instances). Removing a Filebeat instance, or having a machine down for maintenance, frees up bandwidth for the other Beats to use.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.