Filter Duplicate Logs


We built a Filebeat + Logstash + Elasticsearch + Kibana pipeline for a Kafka cluster. When an issue occurs in the cluster, the same logs related to it keep being generated until the issue is resolved, which takes up a lot of space in our Elasticsearch.

Is there any way to filter these continuously generated duplicate logs in Logstash?


Solved. We used the throttle filter.
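For anyone finding this later: a minimal sketch of what a throttle-based setup can look like. The Logstash throttle filter tags events that exceed a rate, and a conditional then drops the tagged events. The `key` field and the specific counts/period here are assumptions you would tune for your own logs, not values from this thread:

```
filter {
  throttle {
    # Group events by their message text (assumption: duplicates share
    # an identical message; adjust the key to whatever identifies a duplicate)
    key          => "%{message}"
    # Let the first 3 matching events through...
    before_count => 3
    # ...then tag everything after the 5th within the period
    after_count  => 5
    # Window length in seconds (example value)
    period       => 3600
    max_age      => 7200
    add_tag      => "throttled"
  }
  # Drop the events the throttle filter tagged
  if "throttled" in [tags] {
    drop { }
  }
}
```

Note that throttling is rate-limiting rather than true deduplication: it caps how many copies reach Elasticsearch per window instead of guaranteeing exactly one.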
