Dead Letter Queue can't be enabled

Hi Team,
I am trying to use the dead letter queue feature and want to create a new index for DLQ input events.
Here are my configurations:
in logstash.yml:
path.data: /usr/share/logstash/data
dead_letter_queue.enable: true
dead_letter_queue.max_bytes: 1024mb

in pipelines.yml:

- pipeline.id: main
  pipeline.workers: 3
  dead_letter_queue.enable: true
  dead_letter_queue.max_bytes: 1024mb
  path.config: "/usr/share/logstash/pipeline/logstash.conf"
- pipeline.id: test
  dead_letter_queue.enable: false
  path.config: "/usr/share/logstash/pipeline/deadletter.conf"

My main configuration file, logstash.conf, is at /usr/share/logstash/pipeline/ and has inputs from beats, kafka, and s3.
My deadletter.conf is also at /usr/share/logstash/pipeline/, and its configuration is:
input {
  dead_letter_queue {
    path => "/usr/share/logstash/data/dead_letter_queue"
    commit_offsets => true
    pipeline_id => "test"
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch"
    manage_template => false
    index => "deadletterlog-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

Now when I run Logstash through Docker, I get this error:
[2018-02-26T07:40:13,825][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://elasticsearch:9200/, :path=>"/"}
[2018-02-26T07:40:13,862][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
[2018-02-26T07:40:13,874][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//elasticsearch"]}
[2018-02-26T07:40:13,878][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"test", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x1bf1f716@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:290 run>"}
[2018-02-26T07:40:13,921][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"test", :plugin=>"<LogStash::Inputs::DeadLetterQueue path=>"/usr/share/logstash/data/dead_letter_queue", commit_offsets=>true, pipeline_id=>"test", id=>"a3d25d3edc326dca05c593696922858580e88e8db65dcadcf89381251ae59511", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_e4bc4a24-8034-4bbf-b80a-438f50a5f5d6", enable_metric=>true, charset=>"UTF-8">>", :error=>"/usr/share/logstash/data/dead_letter_queue/test", :thread=>"#<Thread:0x1bf1f716@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:290 run>"}
[2018-02-26T07:40:14,878][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"test", :exception=>java.nio.file.NoSuchFileException: /usr/share/logstash/data/dead_letter_queue/test, :backtrace=>["sun.nio.fs.UnixException.translateToIOException(sun/nio/fs/UnixException.java:86)", "sun.nio.fs.UnixException.asIOException(sun/nio/fs/UnixException.java:111)", "sun.nio.fs.LinuxWatchService$Poller.implRegister(sun/nio/fs/LinuxWatchService.java:246)", "sun.nio.fs.AbstractPoller.processRequests(sun/nio/fs/AbstractPoller.java:260)", "sun.nio.fs.LinuxWatchService$Poller.run(sun/nio/fs/LinuxWatchService.java:329)", "java.lang.Thread.run(java/lang/Thread.java:748)"], :thread=>"#<Thread:0x1bf1f716@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:290 run>"}
[2018-02-26T07:40:14,891][ERROR][logstash.agent ] Failed to execute action {:id=>:test, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: LogStash::PipelineAction::Create/pipeline_id:test, action_result: false", :backtrace=>nil}

Can anyone help me with this error? Any help and guidance will be appreciated.
Thanks in advance!

Hi @manya12,

I suspect the issue you are having here is the pipeline_id defined in the dead_letter_queue input: pipeline_id should be the name of the pipeline you want to read from, not the pipeline running the dead_letter_queue plugin. In your case that appears to be the pipeline with id main, so try setting pipeline_id to main.
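As a minimal sketch (keeping your existing path and settings, only the pipeline_id value changes), the input block in deadletter.conf would become something like:

input {
  dead_letter_queue {
    # Read DLQ entries written by the "main" pipeline, not this one
    path => "/usr/share/logstash/data/dead_letter_queue"
    commit_offsets => true
    pipeline_id => "main"
  }
}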

Hope this helps!

Rob

@RobBavey
Thanks a lot. It worked for me. :slight_smile:
One more thing I need to ask.
Now that I have the dead letter queue enabled and my dead letter queue input is sending output to the new index, the documents in it do not contain the metadata explaining why the event was sent to the dead letter queue.
Is there any other configuration I need to do for that?
