Hey,
I'm deploying the ELK stack in a big enterprise organization.
Is there any way to send the data to a DR Elasticsearch cluster as well?
- The data should be identical on both ES clusters, but Kibana should only point at the main ES until a crisis.
People often use something like Kafka, then run Logstash in both DCs to read from Kafka and index into Elasticsearch.
What kind of data are you indexing?
Would snapshot and restore be enough? For example, syncing every 15 minutes?
Hey David,
We are indexing data coming from a SIEM server.
We need to send the same data from the SIEM to 2 ES clusters, 1 Prod and 1 DR, so practically the same log file/data will be stored twice in case of a Prod ES failure.
The architecture is something like this: 2 identical ES clusters get the same data from 1 Logstash server and are fronted by 1 Kibana server. The Logstash and Kibana servers can use vMotion, but vMotioning an ES server with 20 TB of data is pretty rough.
My question is: what is the best way to plan the DR? I have seen some suggestions about Tribe Nodes. (Nightly backups/snapshots of the ES DB are not an option because the amount of storage that would be generated is too big.)
Since you are using Logstash, I guess you can send the data to both Elasticsearch clusters using something like this (the host names here are placeholders):
output {
  elasticsearch {
    hosts => ["https://prod-es:9200"]  # primary cluster
  }
  elasticsearch {
    hosts => ["https://dr-es:9200"]    # DR cluster
  }
}
Never tested, but that might work.
About your point 4, I think it will actually be more or less the same. Snapshots are incremental by nature, although you might sometimes end up copying big segments from one location to another.
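If you do go the snapshot route, the basic flow would be registering a shared repository on the Prod cluster, snapshotting on a schedule, and restoring on the DR side from the same repository. A rough sketch in Console syntax (the repository name and path are placeholders, and this assumes both clusters can reach the same shared filesystem):

# Register a shared filesystem repository (location is a placeholder)
PUT _snapshot/dr_repo
{
  "type": "fs",
  "settings": {
    "location": "/mnt/shared_backups"
  }
}

# Take a snapshot; subsequent snapshots are incremental,
# so mostly only new segments get copied
PUT _snapshot/dr_repo/snap-1?wait_for_completion=true

The DR cluster would register the same repository (ideally as readonly) and restore from it, so a 15-minute schedule mostly moves recently written segments rather than the whole 20 TB.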