Corrupt gzipped input file causes index to grow and saturate storage

Hi,

I have a bug to report (or a feature to request; where is the right place for that?). Before filing it, though, I'd like to know whether any of you have found a workaround. I've observed this on Logstash 6.5.0 and 6.6.0.

I'm using a Logstash pipeline that reads data from gzipped files. For some reason, one of the input files is corrupted. The Logstash file input plugin logs an "unexpected end of file" exception and restarts. Shortly after the restart, the exception is raised again, so the input plugin keeps restarting in a loop. Between restarts, however, Logstash re-sends documents to the output (Elasticsearch) again and again, so my index keeps growing with duplicate documents until storage fills up...
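For reference, here is roughly what my pipeline looks like (paths, hosts, and the index name are placeholders, not my real values):

```
input {
  file {
    path => "/data/logs/*.gz"                          # placeholder path
    mode => "read"                                     # read mode handles gzipped files
    file_completed_action => "log"
    file_completed_log_path => "/data/logs/completed.log"
    sincedb_path => "/data/logs/sincedb"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]                        # placeholder host
    index => "mylogs-%{+YYYY.MM.dd}"
  }
}
```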

I'm looking for a quick fix: has anyone run into this before, and how did you work around it?
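One stopgap I'm considering is making the indexing idempotent with a fingerprint-based document ID, so replayed events overwrite the same document instead of creating duplicates. A minimal sketch (the source field and key are placeholders; you'd hash whatever fields make an event unique):

```
filter {
  fingerprint {
    source => "message"                  # placeholder: field(s) that identify the event
    target => "[@metadata][fingerprint]"
    method => "SHA256"
    key    => "any-static-key"           # placeholder; enables HMAC-SHA256
  }
}

output {
  elasticsearch {
    hosts       => ["localhost:9200"]    # placeholder host
    # Re-sent events write to the same _id instead of creating new documents.
    document_id => "%{[@metadata][fingerprint]}"
  }
}
```

This wouldn't stop the restart loop itself, but it should at least keep the index from growing while I deal with the corrupt file.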

Best regards

Olivier
