Recompute event.hash for documents in cold indices

For certain datasets containing important log events, our Logstash pipeline computes `event.hash` over all event fields, using the fingerprint filter with the SHA256 method and a key. The events are then ingested into Elasticsearch.
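Concretely, the filter is configured roughly like this (the key, environment variable, and target field name are illustrative):

```
filter {
  fingerprint {
    # Keyed SHA256 (HMAC) over all fields of the event
    method                 => "SHA256"
    key                    => "${FINGERPRINT_KEY}"
    concatenate_all_fields => true
    target                 => "[event][hash]"
  }
}
```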

Once the documents age into the cold tier, I need to verify from time to time that they have not been tampered with, by recomputing the hash and comparing it to the stored one.

My idea was to create a search over the dataset with a runtime field backed by a Painless script that recomputes the hash. The search would return only documents where the recomputed hash differs from the stored one, ideally including just the important fields: id, timestamp, message, the original `event.hash`, and the recomputed hash.
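Something along these lines is what I have in mind; `hmacSha256()` is just a placeholder for the recomputation, since as far as I can tell stock Painless does not whitelist `javax.crypto`, and that is exactly the part I am unsure about:

```
GET my-logs-*/_search
{
  "runtime_mappings": {
    "hash_mismatch": {
      "type": "boolean",
      "script": {
        "source": """
          String stored = doc['event.hash'].value;
          // hmacSha256() is a placeholder for recomputing the keyed
          // SHA256 fingerprint over the same fields Logstash hashed.
          String recomputed = hmacSha256(params.key, params._source);
          emit(!recomputed.equals(stored));
        """,
        "params": { "key": "<same key as the fingerprint filter>" }
      }
    }
  },
  "query": { "term": { "hash_mismatch": true } },
  "_source": false,
  "fields": ["event.id", "@timestamp", "message", "event.hash"]
}
```

I suppose a second keyword runtime field emitting the recomputed value could be added to `fields` if the computed hash itself should appear in the results.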

The search could then be triggered by a watch created through Kibana (Watcher). However, I am not clear on how to structure such a search. Could you please offer some advice?
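For completeness, I picture the surrounding watch looking something like the skeleton below; the watch id, interval, index pattern, and stored-script id are all placeholders, and the input is the search sketched above (referenced as a stored script so it is not duplicated):

```
PUT _watcher/watch/verify-event-hash
{
  "trigger": { "schedule": { "interval": "24h" } },
  "input": {
    "search": {
      "request": {
        "indices": ["my-logs-*"],
        "body": {
          "runtime_mappings": {
            "hash_mismatch": {
              "type": "boolean",
              "script": { "id": "recompute-event-hash" }
            }
          },
          "query": { "term": { "hash_mismatch": true } },
          "_source": false,
          "fields": ["event.id", "@timestamp", "message", "event.hash"]
        }
      }
    }
  },
  "condition": {
    "compare": { "ctx.payload.hits.total": { "gt": 0 } }
  },
  "actions": {
    "log_mismatches": {
      "logging": {
        "text": "{{ctx.payload.hits.total}} documents failed event.hash verification"
      }
    }
  }
}
```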
