In our Elastic environment we run multiple pipelines on a single Logstash instance, and we have enabled the Dead Letter Queue on all of them. While creating the pipeline to process events from the DLQs, I noticed that I need to specify the name of the pipeline whose DLQ I want to process.
So, do I need to create a separate DLQ processing pipeline for each pipeline on our instance? Or is there a way to have a single DLQ processing pipeline handle all of them? That would seem like the more efficient solution, since each additional pipeline consumes CPU and memory resources.
You can have multiple inputs in a pipeline.
input {
  dead_letter_queue {
    # top-level DLQ directory of this Logstash instance
    # (path shown is illustrative; defaults to path.data/dead_letter_queue)
    path => "/usr/share/logstash/data/dead_letter_queue"
    # id of the first upstream pipeline whose DLQ should be read
    pipeline_id => "pipeline_1"
  }
  dead_letter_queue {
    path => "/usr/share/logstash/data/dead_letter_queue"
    pipeline_id => "pipeline_2"
  }
  dead_letter_queue {
    path => "/usr/share/logstash/data/dead_letter_queue"
    pipeline_id => "pipeline_N"
  }
}
This way you can have all your dead letter queue processing in just one pipeline. You can also limit the resources each pipeline uses via the pipeline.workers option in pipelines.yml.
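For example, a minimal pipelines.yml sketch along those lines (the pipeline ids and config paths here are illustrative, not from your setup):

- pipeline.id: main
  path.config: "/etc/logstash/conf.d/main.conf"
- pipeline.id: dlq_processor
  path.config: "/etc/logstash/conf.d/dlq.conf"
  # keep the DLQ processing pipeline small so it doesn't
  # compete with the main pipelines for CPU
  pipeline.workers: 1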
Tried it and it works perfectly this way. Thank you.