How can we drop log lines using Filebeat?

Hello Team,

I am aware of how to exclude a particular log type (line) using Filebeat; I have implemented it and it is working fine.
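For context, excluding whole lines is typically done with `exclude_lines` on the input. A minimal sketch (the path and pattern here are illustrative, not from the original post; the `filebeat.inputs` key applies to Filebeat 6.3+, older versions use `filebeat.prospectors`):

```
filebeat.inputs:
- type: log
  paths:
    - /var/log/secure   # hypothetical path
  # Drop every line matching any of these regexes before shipping
  exclude_lines: ['Accepted publickey']
```

Note that `exclude_lines` is all-or-nothing per matching line, which is why it cannot keep just one of several duplicates.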

But now I am getting 20 lines of the same log type, and I want to exclude 19 of them at the Filebeat level and send only one line to Elasticsearch.

Below are the sample log lines:

```
Nov  2 09:46:44 xyz sshd[32511]: Accepted publickey for xyz
Nov  2 09:46:44 xyz sshd[32511]: Accepted publickey for xyz
Nov  2 09:46:44 xyz sshd[32511]: Accepted publickey for xyz
Nov  2 09:46:44 xyz sshd[32511]: Accepted publickey for xyz
Nov  2 09:46:44 xyz sshd[32511]: Accepted publickey for xyz
Nov  2 09:46:44 xyz sshd[32511]: Accepted publickey for xyz
Nov  2 09:46:44 xyz sshd[32511]: Accepted publickey for xyz
Nov  2 09:46:44 xyz sshd[32511]: Accepted publickey for xyz
```

Is it possible to drop all log lines except one?

I have tried, but that drops all of the lines.

Any assistance will be appreciated.

Thanks.

Hi @Tek_Chand,

Currently there is no way to remove duplicates in Filebeat; in general, each event is independent of the previous ones. Deduplication logic also tends to be too dependent on the specific use case.

I guess something similar to the multiline support could be implemented, but for detecting consecutive duplicated log lines; a field with the number of repetitions could be added too. Feel free to open an issue with this feature request.

@Jaime, Thank you for your response.

I have opened an issue on GitHub: Duplication removal at filebeat level. #9033

Thanks.

I know it's not quite the answer you're looking for, but if you can first send the events to a Logstash node, you could use the throttle filter, or even the drop filter with the percentage option.
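To illustrate the idea, here is a rough sketch of a Logstash pipeline that tags repeats of the same message within a time window and drops them, keeping only the first. The `period` value and the `throttled` tag name are illustrative choices, not from the original post:

```
filter {
  throttle {
    # Let the first event through, tag everything after it
    before_count => -1
    after_count  => 1
    period       => "5"          # window in seconds (illustrative)
    key          => "%{message}" # dedupe on the full message text
    add_tag      => "throttled"
  }
  if "throttled" in [tags] {
    drop { }
  }
}
```

Alternatively, the drop filter's `percentage` option discards a random fraction of matching events (e.g. `drop { percentage => 95 }`), which thins the volume but does not guarantee exactly one survivor per burst.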

@Philip, Thank you for your suggestion. I will read about the throttle filter and the drop filter with the percentage option.

Thanks.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.