Logstash handling duplicates using UUID

I have a requirement to implement a mechanism that avoids duplicate events in the log data we receive. These duplicates are caused by Elasticsearch or Logstash indexer crashes.

I was going through this link:

https://www.elastic.co/blog/logstash-lessons-handling-duplicates

It looks like the UUID option works well for us, since we use a queue between the Logstash shippers and indexers; a sketch of what I have in mind is below.
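Here is roughly how I understand that setup would look, based on the blog post (a minimal sketch: the host and index names are placeholders, and the shipper/indexer split reflects our queue architecture):

```
# Shipper side: generate the UUID before the event is published to the queue,
# and store it in a regular field so it survives serialization through the queue.
filter {
  uuid {
    target    => "uuid"
    overwrite => true
  }
}

# Indexer side: use that UUID as the Elasticsearch document ID, so a replayed
# event overwrites the existing document instead of creating a duplicate.
output {
  elasticsearch {
    hosts       => ["localhost:9200"]
    index       => "logs-%{+YYYY.MM.dd}"
    document_id => "%{uuid}"
  }
}
```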
But since we have multiple Logstash shippers (producers), and each one generates its own UUIDs, is it possible that two or more shippers generate the same UUID? If that happens, we might unintentionally overwrite existing events.

Please advise. Thanks!
