Hi all,
I am collecting logs from multiple sources (Syslog, SNMP, Windows EventLog) and storing them in Elasticsearch.
Now I need to forward these stored logs from Elasticsearch to multiple external destinations using Logstash.
For example:
- Kafka forward
- EventLog forward
- Syslog forward
- Secondary Syslog forward
To implement this, I pull events from Elasticsearch and then forward them to each output destination.
My challenge is about reliable multi-destination forwarding.
Problem
If I use a field like isForwarded or forward_status to mark a log as "forwarded", what happens in this situation (sketched below)?
- The log successfully forwards to 2 destinations
- Forwarding fails for the other 2 destinations
- But Logstash still marks the event as forwarded because the pipeline completed
This risks data loss, because failed destinations will never receive those logs again.
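
For context, this is roughly the single-pipeline setup I mean. It is a minimal sketch only: the hosts, index pattern, Kafka topic, Syslog servers, and the isForwarded field are placeholders, not my real config:

```
input {
  # Poll Elasticsearch for documents not yet marked as forwarded
  elasticsearch {
    hosts    => ["http://localhost:9200"]
    index    => "logs-*"
    query    => '{ "query": { "bool": { "must_not": { "term": { "isForwarded": true } } } } }'
    schedule => "*/5 * * * *"
  }
}

output {
  # Four independent destinations in a single output section
  kafka  { bootstrap_servers => "kafka:9092" topic_id => "forwarded-logs" }
  syslog { host => "syslog-primary.example.com"   port => 514 }
  syslog { host => "syslog-secondary.example.com" port => 514 }
  # (EventLog forwarding omitted here)

  # ...plus an update back to Elasticsearch that sets isForwarded => true.
  # This single flag is the weak point: if only 2 of the 4 outputs
  # succeed, the event is still recorded as "forwarded".
}
```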
Questions
- What is the recommended pattern to reliably forward logs from Elasticsearch to multiple independent destinations?
- Is there a proper way to track per-destination forward status?
- Should I use separate pipelines per destination, connected with the pipeline { } input/output (sketched below)?
- What is the best practice to ensure that failure of one destination does not affect processing of the others, but also does not lose logs?
- Does Logstash provide any built-in at-least-once guarantee when forwarding from the Elasticsearch input?
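
For the separate-pipelines question, this is the direction I am considering, based on the pipeline-to-pipeline "output isolator" pattern. The pipeline IDs, file paths, and addresses below are placeholders, and I have only sketched two of the four destinations:

```
# pipelines.yml -- one intake pipeline plus an isolated pipeline per
# destination, each downstream pipeline with its own persistent queue,
# so a failed destination only backs up its own queue
- pipeline.id: intake
  path.config: "/etc/logstash/conf.d/intake.conf"
- pipeline.id: to-kafka
  path.config: "/etc/logstash/conf.d/to-kafka.conf"
  queue.type: persisted
- pipeline.id: to-syslog-primary
  path.config: "/etc/logstash/conf.d/to-syslog-primary.conf"
  queue.type: persisted
```

```
# intake.conf -- read from Elasticsearch and fan out to the
# per-destination pipelines (query/schedule details as in the sketch above)
input {
  elasticsearch {
    hosts    => ["http://localhost:9200"]
    index    => "logs-*"
    schedule => "*/5 * * * *"
  }
}
output {
  pipeline { send_to => ["to-kafka", "to-syslog-primary"] }
}
```

```
# to-kafka.conf -- exactly one destination per downstream pipeline
input  { pipeline { address => "to-kafka" } }
output { kafka { bootstrap_servers => "kafka:9092" topic_id => "forwarded-logs" } }
```

If this is the right direction, I would still like to confirm whether the persistent queues here give the at-least-once behavior I asked about above.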
Kindly guide me on the recommended approach for this.
Thanks
Mahesh Kumar S
