What would be the best way to ship logs from one Logstash instance to another? I am currently using the http input and http output plugins. However, I am not sure whether all logs are being shipped without any being dropped, because I see the following stack trace very often:
IMO the easiest way that is both very scalable and reliable is to use Redis between Logstash instances/pipelines. Combining Redis with the multi-pipeline capabilities of Logstash 6.x, I have had over 40 pipelines cooperating seamlessly with each other in a microservices-like deployment model. Kafka provides more options, and it is awesome if you need those capabilities, but when it comes to a combination of simplicity and scale, Redis is the better choice.
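To illustrate the pattern, here is a minimal sketch of a Redis-buffered pair of pipelines; the hostname, port, and key name are placeholders I've made up, not values from this thread:

```conf
# Shipper side: push events onto a Redis list
output {
  redis {
    host => "redis.example.internal"   # placeholder hostname
    data_type => "list"
    key => "logstash-queue"            # placeholder key name
  }
}

# Indexer side: pop events off the same list
input {
  redis {
    host => "redis.example.internal"
    data_type => "list"
    key => "logstash-queue"
  }
}
```

With `data_type => "list"` Redis acts as a simple FIFO buffer between the two Logstash instances, so either side can be restarted without losing the events queued in between.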
Thanks for your quick response. We are planning to use Kafka as a messaging queue, and I hope there won't be any issues once Kafka comes into the picture. But it will take some time for Kafka to be in place, so in the meantime I wanted to figure out the issues associated with the http plugin.
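For reference, the eventual Kafka setup could look roughly like this sketch; the broker address, topic, and consumer group are placeholder names, not from this thread:

```conf
# Shipper side: publish events to a Kafka topic
output {
  kafka {
    bootstrap_servers => "kafka1:9092"  # placeholder broker address
    topic_id => "logs"                  # placeholder topic name
    acks => "all"                       # wait for full broker acknowledgement to avoid drops
  }
}

# Indexer side: consume from the same topic
input {
  kafka {
    bootstrap_servers => "kafka1:9092"
    topics => ["logs"]
    group_id => "logstash-indexers"     # placeholder consumer group
  }
}
```

Setting `acks => "all"` trades some throughput for stronger delivery guarantees, which matches the concern about dropped logs.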
I've come across the Lumberjack plugin, but the problem is that I have already configured Logstash on nearly 50 nodes (Linux and Windows) with the http input and output plugins. If I need to replace http with Lumberjack, it would have to be done on all 50 nodes, which is a painful task. When we start using Kafka I will need to touch all 50 nodes anyway, but in the meantime I am trying to figure out the issue with the http plugin instead of installing the Lumberjack plugin.
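If staying on http for now, it may be worth checking the output's retry settings; a sketch of a more drop-resistant configuration is below. The URL is a placeholder, and the option values are assumptions to illustrate tuning, not values from this thread:

```conf
output {
  http {
    url => "http://indexer.example.internal:8080"  # placeholder endpoint
    http_method => "post"
    format => "json"
    retry_failed => true          # retry events that fail with retryable errors
    automatic_retries => 5        # client-level retries on connection failures
    retry_non_idempotent => true  # also retry POSTs after a dropped connection
  }
}
```

Without `retry_non_idempotent`, POST requests that fail mid-flight are not retried at the HTTP client level, which is one common way events get silently dropped.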
Could you please tell me if there is a way around my problem?
I've come across the Lumberjack plugin, but the problem is that I have already configured Logstash on nearly 50 nodes (Linux and Windows) with the http input and output plugins. If I need to replace http with Lumberjack, it would have to be done on all 50 nodes, which is a painful task.
Seriously, are you configuring those 50 machines by hand?