What is the best way to ship logs from one Logstash to another Logstash?


(Amruth) #1

Hi,

What would be the best way to ship logs from one Logstash instance to another? I am currently using the http input and http output plugins. However, I am not sure whether all the logs are being shipped without any being dropped, because I see the following error very often:

{:timestamp=>"2017-12-18T15:13:40.243000-0500", :message=>"[HTTP Output Failure] Encountered non-200 HTTP code 200", :response_code=>502, :url=>"http://xxx.xx.xxx.xx:xxx", :event=>#<LogStash::Event @cancelled=false>, :level=>:error}
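
For reference, the relevant parts of the two configurations look roughly like this (the host, port, and URL below are placeholders, not my real values):

    # Shipping side (first Logstash) - http output
    output {
      http {
        url         => "http://receiver.example.com:8080"   # placeholder
        http_method => "post"
        format      => "json"
      }
    }

    # Receiving side (second Logstash) - http input
    input {
      http {
        host => "0.0.0.0"
        port => 8080   # placeholder
      }
    }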

I don't understand the reason for this error. How can I ensure that all the logs are being shipped without being dropped? Can someone please advise?

Thanks


(Robert Cowart) #2

IMO the easiest way that is both very scalable and reliable is to use redis between Logstash instances/pipelines. Combining redis with the multi-pipeline capabilities of Logstash 6.x, I have had over 40 pipelines cooperating seamlessly with each other in a microservices-like deployment model. Kafka provides more options, and it is awesome if you need those capabilities, but when it comes to a combination of simplicity and scale, redis is the better choice.
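
As a rough sketch of what that looks like between two instances (the hostname, port, and key name here are only examples):

    # Sending Logstash: push events onto a redis list
    output {
      redis {
        host      => "redis.example.com"   # example host
        port      => 6379
        data_type => "list"
        key       => "logstash-transit"    # example key
      }
    }

    # Receiving Logstash: pop events off the same list
    input {
      redis {
        host      => "redis.example.com"
        port      => 6379
        data_type => "list"
        key       => "logstash-transit"
      }
    }

Redis just sits between the two Logstash instances as a buffer, so the sender can keep writing even if the receiver is briefly unavailable.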


(Amruth) #3

Hi Robert,

Thanks for your quick response. We are planning to use Kafka as a messaging queue, and I hope there won't be any issues once Kafka comes into the picture. But it will take some time for Kafka to be in place, so in the meantime I wanted to figure out the issues associated with the http plugin.
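
Once Kafka is in place, I expect the Logstash-to-Logstash leg to look roughly like this (broker address, topic, and group id are placeholders, and the exact option names depend on the Kafka plugin version):

    # Sending Logstash: produce to a Kafka topic
    output {
      kafka {
        bootstrap_servers => "kafka1.example.com:9092"   # placeholder broker
        topic_id          => "logstash-transit"          # placeholder topic
      }
    }

    # Receiving Logstash: consume from the same topic
    input {
      kafka {
        bootstrap_servers => "kafka1.example.com:9092"
        topics            => ["logstash-transit"]
        group_id          => "logstash-receivers"        # placeholder group
      }
    }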

Thanks


(Christian Dahlqvist) #4

I believe the recommended way is to use a lumberjack output plugin with a beats input plugin, as these should be compatible.
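
Roughly, that pairing would look like the following; the lumberjack output requires SSL, so the host and certificate/key paths below are just placeholders:

    # Sending Logstash: lumberjack output
    output {
      lumberjack {
        hosts           => ["receiver.example.com"]         # placeholder host
        port            => 5044
        ssl_certificate => "/etc/logstash/lumberjack.crt"   # placeholder path
      }
    }

    # Receiving Logstash: beats input
    input {
      beats {
        port            => 5044
        ssl             => true
        ssl_certificate => "/etc/logstash/lumberjack.crt"   # placeholder path
        ssl_key         => "/etc/logstash/lumberjack.key"   # placeholder path
      }
    }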


(Amruth) #5

Hi Christian,

I've come across the Lumberjack plugin, but the problem is that I have already configured Logstash on nearly 50 nodes (Linux and Windows) with the http input and output plugins. If I have to replace http with Lumberjack, it would mean touching all 50 nodes, which is a painful task. When we start using Kafka I will need to do that on 50 nodes anyway, but in the meantime I am trying to figure out the issue with the http plugin instead of installing the Lumberjack plugin.

Could you please tell me if there is a way around my problem?

Thanks


(Magnus Bäck) #6

I've come across the Lumberjack plugin, but the problem is that I have already configured Logstash on nearly 50 nodes (Linux and Windows) with the http input and output plugins. If I have to replace http with Lumberjack, it would mean touching all 50 nodes, which is a painful task.

Seriously, are you configuring those 50 machines by hand?


(Amruth) #7

I have logs coming from all 50 machines. What would be the alternative?

I assume it would be Ansible.


(Magnus Bäck) #8

Ansible or a similar tool, yes. Don't ever configure machines by hand.


(Amruth) #9

Hmm...

Would you happen to know why there are issues with the Logstash HTTP plugins? Is there any way Logstash can guarantee that no logs are being dropped?


(Josh Speer) #10

This has always been my way of doing it, until I had to encrypt the data. Open-source redis doesn't support TLS, so I'm trying out lumberjack now.

