Questions about dead letter queue

Hi there, we've recently started looking into setting up a dead_letter_queue pipeline at Discourse, but I had some questions that weren't immediately clear from the documentation.

  1. Do the DLQ log files get deleted at all? From what I'm seeing currently, the log files remain on disk even after they have been processed by a pipeline. Also, when does Logstash decide to create a new DLQ log file? (1.log, 2.log, etc.)

  2. What is the recommended practice for setting up a dead_letter_queue pipeline? Since the DLQ log files are not really human-readable, is it recommended to set up a pipeline with stdout and elasticsearch outputs?

  3. Is it recommended to set commit_offset to true for monitoring purposes and then flick it to false once you're ready to fix the underlying bug?

  4. When commit_offset is true, where does logstash save the state?

Thank you in advance!

Hi @tgxworld,

Currently, the DLQ logs do not get deleted automatically - this is a feature that will be added in a future Logstash release.

A new DLQ log file is created either a) when Logstash is started up, or b) when a new event would cause the log file to be larger than the 'maximum segment size' - this is currently set to 10MB.
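For context, the DLQ itself has to be enabled in `logstash.yml` before any segment files are written. A minimal sketch (the path shown is an example, not something from the original post):

```
# logstash.yml -- enable the dead letter queue
dead_letter_queue.enable: true

# Optional: where the segment files (1.log, 2.log, ...) are written.
# Defaults to path.data/dead_letter_queue if not set.
path.dead_letter_queue: "/var/lib/logstash/dead_letter_queue"
```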

To view the dead letter queue there is a dead letter queue input plugin. This can be processed like any other Logstash input: initially you can pipe to stdout, or similar, using the rubydebug codec (with metadata enabled) to determine the errors. Then you can use the mutate filter to remove any mapping errors (as shown here).
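A minimal sketch of such a pipeline, assuming a DLQ path of `/var/lib/logstash/dead_letter_queue` and the default pipeline id of `main` (both are assumptions for illustration):

```
input {
  dead_letter_queue {
    path        => "/var/lib/logstash/dead_letter_queue"
    pipeline_id => "main"
  }
}

output {
  # Print events including the [@metadata][dead_letter_queue] fields,
  # which record why each event was rejected.
  stdout { codec => rubydebug { metadata => true } }
}
```

Once you can see the failure reasons in the metadata, you can swap the stdout output for an elasticsearch output and add a mutate filter to strip the offending fields.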

The commit_offset flag, when set to true, will cause the last processed DLQ entry to be stored in the .sincedb file stored in the ${}/plugins/inputs/dead_letter_queue/${} folder (pipeline_id will be main unless it has been manually set). If the commit_offset flag is set to false, these offsets will not be stored.
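For reference, in the input plugin's configuration the option is spelled `commit_offsets` (plural); a sketch with an assumed path:

```
input {
  dead_letter_queue {
    path           => "/var/lib/logstash/dead_letter_queue"  # assumed path
    commit_offsets => true   # remember the read position in .sincedb across restarts
    # sincedb_path => ...    # optional override of the default sincedb location
  }
}
```

Setting `commit_offsets => true` while monitoring means a restart resumes where you left off; flipping it to false lets you replay the whole queue once the underlying bug is fixed.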




This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.