Hi folks, I'm trying to understand what the best setup is to guarantee zero lost messages.
I know Logstash will not purposely lose messages, but I have seen references to people pulling in Redis to ensure they don't lose any, which seems like an odd requirement. My understanding is that in this configuration Redis just acts as a bigger 'insert' buffer in front of Elasticsearch, since Elasticsearch can get slow when dealing with lots of inserts (I'm assuming Elasticsearch itself will not drop inserts when it's under heavy load, and instead just gets slow).
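For context, the broker pattern I've seen described splits Logstash into a shipper and an indexer with Redis in between. A rough sketch of what I mean (the file path, Redis host, and key name here are just placeholders I made up):

```
# shipper.conf — reads logs and pushes raw events onto a Redis list
input  { file  { path => "/var/log/app.log" } }
output { redis { host => "127.0.0.1" data_type => "list" key => "logstash" } }

# indexer.conf — pops events off the Redis list, filters, and indexes
input  { redis { host => "127.0.0.1" data_type => "list" key => "logstash" } }
output { elasticsearch { hosts => ["localhost:9200"] } }
```

If the indexer falls behind or goes down, events should accumulate in the Redis list rather than back-pressuring the shipper, which is presumably the point of the setup.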
I read that Logstash uses a 20-message buffer between each of its 3 pipeline stages, for a total of roughly 60 messages in flight across the pipeline.
Question: if Logstash crashes with a full pipeline, will it pick those messages up again when it comes back online, or does it treat them as already processed?