We have a client who sells goods globally and wants to use Elasticsearch as their search engine.
Their databases live in one region, but they want fast search performance in multiple regions (Europe, North America, Asia, etc.). The search engine and all related components run on Google Cloud Platform (GCP). Here are some ideas, but we need help:
Replicate the DBs to all regions
Have an ELK cluster on VMs in each region connecting to the replicated DB in the same region
How should we integrate the search requests and results from multiple ELK clusters with their single global web frontend? A GCP Global Load Balancer (GLB) with CDN? How do we set up a GCP GLB in front of multiple ELK clusters in different regions?
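For the second idea, the per-region indexing job we have in mind would look roughly like the sketch below. This is only a sketch: the hostnames, the index name, and the fetch_changed_products() helper are placeholders standing in for the real DB replica and cluster endpoints.

```python
# Sketch of a per-region indexing job: read changed rows from the DB replica
# in this region and bulk-index them into the ELK cluster in the same region.
# All names here (host, index, fetch_changed_products) are placeholders.
from elasticsearch import Elasticsearch, helpers

REGIONAL_ES = "https://es.europe-west1.internal:9200"  # regional cluster endpoint (placeholder)
INDEX_NAME = "products"


def fetch_changed_products():
    """Hypothetical helper that reads changed rows from the regional DB replica."""
    # In reality this would query the replicated database in this region.
    yield {"id": "sku-1", "name": "Example product", "price": 9.99}


def to_actions(rows):
    # Turn DB rows into Elasticsearch bulk actions.
    for row in rows:
        yield {
            "_op_type": "index",
            "_index": INDEX_NAME,
            "_id": row["id"],
            "_source": row,
        }


def run_once():
    es = Elasticsearch(REGIONAL_ES)
    ok, errors = helpers.bulk(es, to_actions(fetch_changed_products()), raise_on_error=False)
    print(f"indexed={ok} errors={len(errors)}")


if __name__ == "__main__":
    run_once()
```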
I guess ELK replication is more expensive than DB replication. Either way, the last step is that two ELK clusters in different regions need to receive search requests and send the search results back to the same single web frontend. So the question is how we should accomplish that. I hope it is doable and want to find out how...
It is doable, but it is an infrastructure/architecture question that is out of the scope of this forum.
You basically need a load balancer that receives the requests and can send them to each one of your clusters; your application will only talk to this load balancer.
But how you implement this, and whether it works as you expect, depends entirely on your infrastructure and architecture.
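For example, if your global load balancer exposes a single hostname in front of the regional clusters, the frontend code only ever sees that one endpoint, roughly as in the sketch below. The hostname, API key, and index name are placeholders; which regional cluster actually serves a given query is entirely up to the load balancer.

```python
# Minimal sketch: the frontend talks to one load-balancer endpoint only;
# the load balancer decides which regional cluster serves the query.
# Hostname, API key, and index name below are placeholders.
from elasticsearch import Elasticsearch

SEARCH_ENDPOINT = "https://search.example.com:9243"  # single global LB hostname (placeholder)

es = Elasticsearch(SEARCH_ENDPOINT, api_key="REDACTED")


def search_products(text: str):
    # Same code regardless of which region ends up handling the request.
    resp = es.search(
        index="products",
        query={"match": {"name": text}},
        size=10,
    )
    return [hit["_source"] for hit in resp["hits"]["hits"]]


if __name__ == "__main__":
    for product in search_products("coffee"):
        print(product)
```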