This is more of a follow-up to these:
Currently, DLQ log files are not deleted automatically; automatic cleanup is planned for a future Logstash release.
A new DLQ segment file is created either (a) when Logstash starts up, or (b) when writing a new event would push the current segment past the maximum segment size, which is currently 10 MB.
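For reference, the DLQ is controlled from `logstash.yml`. A minimal sketch — the path and size values below are illustrative, not defaults you must use:

```
# logstash.yml -- illustrative values, adjust to your deployment
dead_letter_queue.enable: true

# Directory where segment files are written
# (defaults to a dead_letter_queue directory under path.data)
path.dead_letter_queue: /var/lib/logstash/dead_letter_queue

# Total size cap per pipeline; once reached, Logstash stops
# writing new entries to the DLQ
dead_letter_queue.max_bytes: 1024mb
```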
To view the dead letter queue, use the dead letter queue input plugin. Its events can be processed like any other Logstash input; to start, you can simply pipe them to stdout.
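A minimal inspection pipeline might look like this — the `path` and `pipeline_id` values are illustrative and should match your own setup:

```
# dlq-view.conf -- dump DLQ events to stdout for inspection
input {
  dead_letter_queue {
    # directory configured via path.dead_letter_queue (illustrative path)
    path => "/var/lib/logstash/dead_letter_queue"
    # id of the pipeline whose DLQ you want to read
    pipeline_id => "main"
    # remember the read position so events are not re-read on restart
    commit_offsets => true
  }
}
output {
  stdout { codec => rubydebug }
}
```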
We are using the dead letter queue with the Elasticsearch output plugin, and the dead letter queue input to reprocess events. It works quite well, except that old entries are never removed: eventually the dead letter queue fills up and Logstash refuses to write new entries.
Is there a way to clean up old entries? Without that, the dead letter queue is practically useless.
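Since there is no built-in cleanup yet, a common manual workaround (an assumption on my part, not an official feature) is to stop Logstash and delete the segment files by hand. The directory below is illustrative; use whatever your `path.dead_letter_queue` points at:

```shell
# Manual DLQ cleanup sketch -- stop Logstash BEFORE running this,
# and only delete entries you have already reprocessed.
DLQ_DIR=/var/lib/logstash/dead_letter_queue/main   # illustrative path
rm -f "$DLQ_DIR"/*.log
```

Deleting segments while Logstash is running risks corrupting the queue, so a clean shutdown first is important.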
Are the DLQs automatically cleaned up?
I created a separate Logstash pipeline to "re-feed" Elasticsearch. That works, but the DLQ files are still there on disk.
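For context, the re-feed pipeline is along these lines — the path, pipeline id, and Elasticsearch host are illustrative:

```
# dlq-refeed.conf -- replay DLQ events back into Elasticsearch
input {
  dead_letter_queue {
    path => "/var/lib/logstash/dead_letter_queue"  # illustrative path
    pipeline_id => "main"
    commit_offsets => true
  }
}
# (filters would go here to fix whatever caused the original failure,
#  otherwise the events may just land back in the DLQ)
output {
  elasticsearch {
    hosts => ["localhost:9200"]  # illustrative host
  }
}
```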
The documentation doesn't cover this; I think it should.
Is this expected?