How to ignore the same log contents across different log files?


(alicia) #1

Hi,

We have log files in different directories, and the same log content can appear several times in different files.
How can we ignore duplicate log content across files, so that each log line is sent to ES only once?

Thanks.


(Magnus Bäck) #2

Logstash has no standard plugin to de-duplicate logs, but if you compute the Elasticsearch document id from the message contents you can de-duplicate on the ES side (i.e. you'll send all documents to ES, but duplicates will overwrite each other, leaving a single copy of each).


(alicia) #3

Hi @magnusbaeck,

Thanks for your reply.

Sorry, I don't understand. How can I do this on the ES side?
Are you talking about using a DSL query in Kibana?

Thanks.


(Magnus Bäck) #4

No, I meant that you use the fingerprint filter to compute a checksum of the event, store that checksum in a field that you reference in the document_id option of your elasticsearch output. If two identical events arrive they'll get the same document id and therefore won't be stored twice.
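
A minimal sketch of that pipeline might look like the following. The hosts, index name, and `source` field are placeholders to adapt to your setup; the fingerprint is stored under `[@metadata]` so it isn't indexed as part of the document itself:

```
filter {
  fingerprint {
    source => "message"                   # hash the raw log line
    target => "[@metadata][fingerprint]"
    method => "SHA1"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logs"
    document_id => "%{[@metadata][fingerprint]}"
  }
}
```

One caveat: if you write to time-based indices (e.g. `logs-%{+YYYY.MM.dd}`), identical events arriving on different days land in different indices and won't overwrite each other, so a fixed index name (as above) is needed for full de-duplication.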


(alicia) #5

Thanks @magnusbaeck.

Do you have an example about that config?

Thanks.


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.