Setting up DR using Kafka


(Martin Michelsen) #1

Hi,

I'm looking to set up an Elasticsearch DR deployment where both instances run simultaneously, with all events present at both sites. The two data centers are about 30 miles apart and connected over fiber. We are using Filebeat and Winlogbeat to ship the event data. Under normal operation, both instances should converge on the same view of the data within a modest time window, and they should resynchronize relatively soon after an outage is restored. I read the blog on using Kafka to send the events to both sites (see https://www.elastic.co/blog/scaling_elasticsearch_across_data_centers_with_kafka). I'm interested in the second proposed architecture, which involves writing to both local and remote Kafka queues, with Logstash reading from both of them.

My question is that I'm unclear on how to implement this architecture. My understanding is that the Beats Kafka output can only write to a single Kafka cluster. What would be the best way to implement this approach?
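For context, the kind of fan-out I have in mind would look something like this, with Beats writing to its one supported Kafka output and something like Logstash duplicating events to both clusters (host names, ports, and topic names below are just placeholders I made up):

```
# filebeat.yml -- Beats supports only one active output,
# so this can target only the local Kafka cluster
output.kafka:
  hosts: ["kafka-local-1:9092", "kafka-local-2:9092"]
  topic: "beats-events"
```

```
# logstash pipeline -- one possible way to fan out to both sites:
# receive from Beats directly and write to the local and remote clusters
input {
  beats { port => 5044 }
}
output {
  kafka {
    bootstrap_servers => "kafka-local-1:9092"
    topic_id => "beats-events"
  }
  kafka {
    bootstrap_servers => "kafka-remote-1:9092"
    topic_id => "beats-events"
  }
}
```

Is something along these lines the intended implementation, or is there a better way (e.g. Kafka-level replication between the clusters)?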

Marty


(system) #2

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.