How to prevent duplicate logs

Hi there,

I have found something odd in my Elastic cluster: the exact same log is showing up twice in the message field,

as you can see. The logs have the same timestamp, the same X-Request-ID, everything the same. Why does it appear twice, and how can I prevent it?

FYI, these logs are from OCP, and I'm using Elasticsearch 7.13; Logstash is on the same version.


Hello @yuswanul ,

If you're using a rollover index for ILM along with the Logstash Elasticsearch output, that might be why you're getting the same data in multiple rollover indices.
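For reference, a minimal sketch of a Logstash Elasticsearch output that writes through an ILM rollover alias (the host, alias, and policy names here are assumptions, not taken from the poster's setup). The point is that events should go to the single write alias managed by ILM, not to concrete index names, so one event can't land in multiple backing indices:

```
output {
  elasticsearch {
    hosts => ["https://localhost:9200"]   # assumed host
    ilm_enabled => true
    ilm_rollover_alias => "ocp-logs"      # assumed rollover alias name
    ilm_policy => "ocp-logs-policy"       # assumed ILM policy name
  }
}
```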

We would need to see how you're sending the data, along with your Logstash config file.

If you look at the JSON tab of an expanded event in the Kibana Discover pane, is the message field in question an array? If so, what does your filter section look like? Are you using grok or mutate to add a field called [message]?
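To illustrate the array case: grok appends to an existing field rather than replacing it, so a pattern that captures back into message leaves an array holding both the original line and the captured copy, which Discover renders as the same text twice. A minimal sketch (the pattern itself is an assumption):

```
filter {
  grok {
    # Capturing into "message" while "message" already exists turns it into
    # an array: [ original line, captured portion ]
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:message}" }
  }
}
```

The usual fixes are to capture into a differently named field, or to add `overwrite => [ "message" ]` so grok replaces the field instead of appending to it.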

After reviewing the Logstash pipeline, I think it's because of a _grokparsefailure. I'm using a grok filter and mutate (remove_field) in the pipeline for this index. I will start updating the Logstash pipeline. Thanks for the help!
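For anyone landing here later, a hedged sketch of the kind of pipeline cleanup described above (the pattern and field names are assumptions): `overwrite` stops grok from turning message into an array, and guarding the mutate on _grokparsefailure avoids removing fields from events that failed to parse:

```
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:message}" }
    overwrite => [ "message" ]          # replace message instead of appending
  }
  if "_grokparsefailure" not in [tags] {
    mutate {
      remove_field => [ "timestamp" ]   # assumed field to drop after a successful parse
    }
  }
}
```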

On Tue, Jan 3, 2023, 00:49, Badger via Discuss the Elastic Stack <> wrote:

Probably not what's happening here, but I'm curious: is Logstash pulling events from Kafka?
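Context for the question: Kafka consumption is at-least-once by default, so a Logstash restart or consumer-group rebalance can replay events that were already shipped. One common mitigation (not confirmed as the fix in this thread) is to derive a deterministic document ID with the fingerprint filter so a replayed event overwrites its earlier copy instead of duplicating it; the source fields and host below are assumptions:

```
filter {
  fingerprint {
    source => ["message", "X-Request-ID"]   # assumed fields that identify an event
    target => "[@metadata][fingerprint]"
    method => "SHA256"
    concatenate_sources => true
  }
}
output {
  elasticsearch {
    hosts => ["https://localhost:9200"]     # assumed host
    document_id => "%{[@metadata][fingerprint]}"
  }
}
```

Note that this only deduplicates within a single index, so it won't catch a replay that arrives after rollover has moved the write index.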

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.