Hello Experts,
We are using Logstash as both a syslog receiver and a forwarder, over TCP and UDP. Logstash forwards each syslog event to a server as soon as it is received on the input. In some cases, however, that server may be unavailable, so the events occurring during that window should be buffered/stored and resent once the server is available again. For this purpose we are using the dead letter queue, but we are facing difficulty with the setup.
We enabled the dead letter queue in logstash.yml and left the path at the default, i.e. logstash/data/dead_letter_queue.
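For reference, the relevant logstash.yml settings look like this (a sketch; the path line is commented out because we left it at the default under path.data):

```yaml
# logstash.yml -- dead letter queue settings we enabled
dead_letter_queue.enable: true
# path.dead_letter_queue: /usr/share/logstash/data/dead_letter_queue  # default location
```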
Files 1.log, 2.log, ... are being created, but they are all empty, and events are not resent to the output server when it comes back online.
Below is the logstash.conf file:
input {
  syslog {
    port => 1470
  }
  udp {
    port => 514
    type => syslog
  }
  dead_letter_queue {
    path => "/usr/share/logstash/data/dead_letter_queue"
    commit_offsets => true
    pipeline_id => "main"
  }
}
output {
  syslog {
    host => "192.168.10.1"
    port => 514
    protocol => "udp"
  }
}
In some articles it is mentioned that the dead letter queue only supports the Elasticsearch output. If this is the case, can you kindly suggest a workaround to achieve our goal?
Can a persistent queue be used to store syslog events while the output server is unavailable and resend them when the output server comes back online?
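If so, would settings along these lines be the right direction? (A sketch only; the size and path values are example placeholders, not tuned.)

```yaml
# logstash.yml -- persistent queue sketch (values are examples)
queue.type: persisted            # default is "memory"
queue.max_bytes: 1gb             # cap on disk usage before inputs see backpressure
# path.queue: /usr/share/logstash/data/queue   # optional; defaults under path.data
```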
Details for faster resolution:
Version: Logstash 5.6.3
Operating system: Ubuntu 14.04