Sending Logstash data to multiple servers

Hi!
I would like to know if it is possible to send data collected by Logstash to multiple destinations outside of my cluster that run Elasticsearch.

Many thanks!

Hi @The1,

It is. You can configure several Elasticsearch outputs. Note that if there are any problems with one of the outputs, Logstash will stop shipping to all outputs. It might be possible to work around this using pipelines, but I do not know anything about Logstash pipelines.
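A minimal sketch of what a Logstash config with two Elasticsearch outputs could look like (the cluster hostnames here are placeholders, adjust them to your setup):

input {
  beats {
    port => 5044
  }
}

output {
  # Every event that reaches the output section is sent
  # to each of the outputs listed below.
  elasticsearch {
    hosts => ["http://es-cluster1:9200"]
  }
  elasticsearch {
    hosts => ["http://es-cluster2:9200"]
  }
}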

Hi A_B,

Wow, that was a quick reply. Thanks for the hint! The other idea I had was to configure Filebeat to send log files to multiple Logstash/Elasticsearch instances at locations external to the cluster network. Would that work?

I think using Filebeat to push to multiple Logstash/Elasticsearch instances will work. That's exactly what I'm doing - but remember to set loadbalance: true in filebeat.yml. This caught me out and it took me two days to realise what was missing :slight_smile:

I hope this helps

output.logstash:
  hosts: ["server1:5044", "server2:5044"]
  loadbalance: true

I would expect loadbalance in the Filebeat config to mean: load balance over these Logstash instances, which all forward to the same ES cluster. If the Logstash instances are configured for different Elasticsearch clusters, you would only have partial data in each destination cluster.

With multiple output definitions in Logstash, every message will be shipped to every output.
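Since my first reply I had a quick look at the docs; the pipelines workaround I mentioned seems to be what the Logstash docs call the output isolator pattern. A rough, untested sketch of pipelines.yml (the pipeline IDs, addresses, and hosts are placeholders):

- pipeline.id: intake
  config.string: |
    input { beats { port => 5044 } }
    # Forward each event to both downstream pipelines
    output { pipeline { send_to => ["es1", "es2"] } }
- pipeline.id: out-es1
  config.string: |
    input { pipeline { address => "es1" } }
    output { elasticsearch { hosts => ["http://es-cluster1:9200"] } }
- pipeline.id: out-es2
  config.string: |
    input { pipeline { address => "es2" } }
    output { elasticsearch { hosts => ["http://es-cluster2:9200"] } }

Each downstream pipeline has its own queue, so if one cluster becomes unreachable only its pipeline backs up while the other keeps shipping - at least until the blocked queue fills, which is why the docs suggest enabling persistent queues with this pattern.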

Thank you very much for your comments.

@A_B Could you further clarify in which scenario you would expect partial data? Is it Logstash configured to send to multiple Elasticsearch instances, or Filebeat configured to send to multiple Logstash instances?

I'm not 100% sure about this...

In normal situations Filebeat would be configured to send logs to a specific ES cluster. Using the loadbalance setting would in this case mean Filebeat can choose one of the configured Logstash instances to ship to. All Logstash instances would usually be configured to send to the same ES cluster.

If you were to configure Filebeat with Logstash instances that ship logs to different ES clusters (as in logstash1 -> es_cluster1, logstash2 -> es_cluster2, etc.), you would get some logs shipped to es_cluster1 and some to es_cluster2. In this example, Filebeat (I assume) does not send its logs to both logstash1 AND logstash2, only to one or the other. I would not expect anyone to configure Filebeat and/or Logstash this way, just saying...

Hope that makes sense...

And just to clarify, I interpreted the OP to mean the data should be replicated one-to-many (a copy for each destination). If it is meant to be spread between destinations one-to-any (any one of the possible destinations), then never mind what I have said above... :slight_smile:

Hi A_B

The request I am working on is to configure the ELK stack in a way that makes the log data accessible at multiple destinations. The plan is to have multiple ELK stacks installed on different clusters, so that the log files collected by Filebeat are accessible from ELK stacks located outside of the cluster from which Filebeat collects them. Hence, the log files sent by Filebeat should be stored outside of the Filebeat cluster. The goal is to keep the log data safe even if the cluster in which Filebeat runs goes down. Does my question make sense?

Right... This is a few years old but should still hold true: https://www.elastic.co/blog/clustering_across_multiple_data_centers

See the "What Are The Options?" part.

This might also be of interest: https://www.elastic.co/blog/hot-warm-architecture-in-elasticsearch-5-x

It is not very cost-effective to duplicate logs, but if your use case requires the redundancy and uptime, one of the links above has some suggested solutions.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.