So I am trying to use the Logstash tcp output plugin to send data from one Logstash service to another via the tcp input plugin. Everything works just fine when the receiving tcp input is up and running. However, when it is not (the node goes down, Logstash crashes, or something else unexpected happens), the sending tcp output has to recreate the TCP connection and send another SYN, which never gets ACKed because the receiving tcp input is down. Normally this would be fine, and I would expect to lose data until the receiving node recovers, but it seems that while it is in this state the sending Logstash blocks the entire output pipeline.
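For reference, a stripped-down sketch of what I'm describing (the input is just an example, and the hosts/ports are placeholders):

```
# Sending Logstash (input shown only for illustration; hosts/ports are placeholders)
input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  tcp {
    host => "receiver.example.com"
    port => 5000
    codec => json_lines
  }
}

# Receiving Logstash (placeholder port)
input {
  tcp {
    port => 5000
    codec => json_lines
  }
}
```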
In my situation this is a problem, as I would like the sending Logstash to still receive inputs correctly and push to its other outputs (i.e. Elasticsearch), but that doesn't seem to be possible.
Is there a way to keep the tcp output plugin from blocking the entire pipeline when a connection cannot be established? Or is there another output plugin I should be using for my scenario? Would the udp output plugin do what I want?
Following up on my UDP comment: I would rather not use it, as then I would have to expect data loss even when everything is up and running correctly.
So the question is: can I reliably send data with TCP, but avoid blocking the entire pipeline when the endpoint is down?
Also, I am on Logstash 5.1.1, if that's relevant.
@magnusbaeck Hmm. So then I have to use UDP if I want fire-and-forget? Or could I do something like pass events to another Logstash instance running on the same host and then use TCP from that instance to the remote host (see the sketch below)? Or would that still ultimately back up the original pipeline?
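To make that idea concrete, this is roughly what I had in mind, not something I've tested; the main instance would point a tcp output at localhost, and a second "forwarder" instance would relay to the remote host (all ports here are arbitrary placeholders):

```
# Hypothetical forwarder instance on the same host (ports are placeholders)
input {
  tcp {
    port => 5001            # the main instance's tcp output would point here
    codec => json_lines
  }
}
output {
  tcp {
    host => "receiver.example.com"
    port => 5000
    codec => json_lines
  }
}
```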
@magnusbaeck What is interesting is that the http output plugin doesn't seem to block the pipeline at all. It just logs an error that the connection was refused by the remote host (i.e. my remote host was down). I feel like the tcp output should behave similarly, as I would rather not take on the overhead of HTTP to pass data from Logstash on one host to Logstash on a remote host.
But at least for my use case, http will work for now.
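In case it helps anyone else, here is roughly the http setup I ended up with (URL and port are placeholders, and the options shown are just the basics):

```
# Sending side: http output instead of tcp (URL is a placeholder)
output {
  http {
    url => "http://receiver.example.com:8080"
    http_method => "post"
    format => "json"
  }
}

# Receiving side: http input (placeholder port)
input {
  http {
    port => 8080
  }
}
```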