Duplicate records found


(Sukumar Chakka) #1

Hi,
I received some duplicate records in Elasticsearch. I suspect one reason is that the time allowed for sending an acknowledgement back to Kafka was exceeded, but if that is the case, where should I increase it? We are using the pipeline below to process the logs.

Filebeat pushes events to Kafka, and Logstash consumes events from Kafka to apply grok. Please share your thoughts on where records could be duplicated, and whether there is an acknowledgement timeout configuration I can increase.
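For context, a minimal sketch of the kind of Logstash pipeline described above, with the consumer-timeout settings that the Logstash kafka input plugin exposes. The broker address, topic name, and filter pattern are placeholders, not taken from the original post; the timeout values shown are only illustrative defaults:

```
input {
  kafka {
    bootstrap_servers => "kafka:9092"    # placeholder broker address
    topics => ["filebeat-logs"]          # placeholder topic name
    # If Logstash takes longer than these windows to process and commit a
    # poll, the broker can consider the consumer dead, trigger a rebalance,
    # and re-deliver unacknowledged records, which then appear in
    # Elasticsearch as duplicates.
    session_timeout_ms => 30000
    max_poll_interval_ms => 300000
    max_poll_records => 500              # smaller batches finish within the interval
  }
}

filter {
  grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }  # placeholder pattern
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
}
```

Raising `max_poll_interval_ms` or lowering `max_poll_records` gives slow grok filters more headroom before a rebalance re-delivers records.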


(system) #2

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.