I'm removing duplicate data using the fingerprint filter in Logstash. I have an Elastic Cloud setup: Logstash reads data from my 3-node Elastic Cloud cluster and writes it back to the same cluster without duplicates.
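For reference, this is roughly the dedup pipeline I mean, a minimal sketch where the source field, index name, and hosts are placeholders, not my real values:

```
filter {
  fingerprint {
    source => ["message"]            # field(s) to hash; placeholder
    target => "[@metadata][fp]"
    method => "SHA256"
  }
}

output {
  elasticsearch {
    hosts       => ["https://my-cluster.example:9243"]  # placeholder
    index       => "dedup-index"                        # placeholder
    document_id => "%{[@metadata][fp]}"  # same fingerprint -> same _id, so a re-send overwrites instead of duplicating
  }
}
```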
While running this pipeline, my Elasticsearch cluster went down several times for short intervals. Because of those downtimes I can't verify whether any data was lost, since the index is very large.
So I want to know: what happens to data when Logstash can't send it to Elasticsearch because of a failure on the Elasticsearch side?
Does Logstash keep the events in memory until Elasticsearch is reachable again?
Or do I lose data while Elasticsearch is unreachable?