All,
I have configured my primary pipeline to process events from the beats input plugin and write them to the dead letter queue whenever there are any issues. A secondary pipeline then reads from the DLQ and sends those events to Elasticsearch in a separate index.
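For reference, the DLQ is enabled on the primary side in logstash.yml. This is a sketch of the relevant settings (the path is just the default under path.data on my install, shown explicitly here because the secondary pipeline's path must match it):

# logstash.yml (primary pipeline side)
dead_letter_queue.enable: true
# optional; defaults to path.data/dead_letter_queue
path.dead_letter_queue: "/var/lib/logstash/dead_letter_queue"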
Secondary pipeline:
input {
  dead_letter_queue {
    path => "/var/lib/logstash/dead_letter_queue"
    commit_offsets => true
    codec => "json"
  }
}
output {
  elasticsearch {
    document_type => "dlq"
    hosts => ["es host"]
    index => "dlq-%{+YYYY.MM.dd}"
  }
}
When I enable this pipeline, it aborts with:
Pipeline aborted due to error {:exception=>java.nio.file.NoSuchFileException: /var/lib/logstash/dead_letter_queue/main/60.log, :backtrace=>["sun.nio.fs.UnixException.translateToIOException(sun/nio/fs/UnixException.java:86)", "sun.nio.fs.UnixException.rethrowAsIOException(sun/nio/fs/UnixException.java:102)", "sun.nio.fs.UnixException.rethrowAsIOException(sun/nio/fs/UnixException.java:107)"
I am wondering whether the DLQ input plugin caches its position somewhere, so that it looks for the file it was processing last time and tries to resume from there. There is currently no file named 60.log at the path referenced in the error.
I tried setting commit_offsets to both true and false, but the same error persists.
Is there a way I can completely reset the DLQ input plugin so that it starts from scratch?
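From reading the plugin options, my guess (unverified on my side) is that with commit_offsets => true the plugin persists its read position in a sincedb file under <path.data>/plugins/inputs/dead_letter_queue, and that this stale offset is what still points at the missing 60.log. A minimal sketch of the input I am considering, using the plugin's sincedb_path and pipeline_id options to point at a fresh sincedb location so the old offset is forgotten:

input {
  dead_letter_queue {
    path => "/var/lib/logstash/dead_letter_queue"
    # "main" is the default pipeline id; shown here for clarity
    pipeline_id => "main"
    commit_offsets => true
    # assumption: a fresh, empty sincedb location should discard the
    # stored offset (the stale reference to 60.log) and make the plugin
    # start reading from the beginning of the queue
    sincedb_path => "/var/lib/logstash/dlq_sincedb_fresh"
  }
}

Would deleting the old sincedb file while Logstash is stopped achieve the same reset?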
Logstash version: 5.5.0
Thanks for helping.