Filebeat: ignore log entries over a certain size


I am using Filebeat to ship logs from my server directly to Elastic Cloud. Sometimes there are inconsistent log entries spanning hundreds of lines which I don't want to harvest. Is there any way I can achieve this? Also, is it possible that those logs are responsible for the high memory pressure on my cluster?



You can use a processor like drop_event to drop the events you don't want to ship to ES.

About memory pressure, it could be caused by many things; however, dropping this kind of unwanted event may improve things, since it reduces the amount of data ES has to index.
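As a sketch, a drop_event processor takes a condition and discards any event that matches it. The condition below is purely illustrative (a regexp on the message field); you would replace it with whatever identifies your unwanted entries:

```yaml
# In filebeat.yml — drop any event whose message starts with "DBG:"
# (the pattern is an example; adapt the condition to your logs)
processors:
  - drop_event:
      when:
        regexp:
          message: "^DBG:"
```

Note that drop_event requires a condition that matches the events you want to discard, which is why it doesn't help when there is no reliable pattern.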



Thanks for your response, Chris.

Since there is no specific pattern I could use in a conditional to drop the logs, I have used the truncate_fields processor to set a maximum character limit instead.
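That approach might look something like this in filebeat.yml (the 1000-character limit here is just an example value):

```yaml
# Truncate the message field so oversized entries are capped
# rather than dropped (max_characters value is illustrative)
processors:
  - truncate_fields:
      fields:
        - message
      max_characters: 1000
      fail_on_error: false
      ignore_missing: true
```

Setting fail_on_error to false keeps Filebeat shipping the event unchanged if truncation fails for some reason.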
