Hi,
I have set up a cluster with 3 nodes, each running Elasticsearch. I have also installed Kibana on one of the nodes. I would like to know how to configure this setup so that Kibana can read from the cluster (any of the Elasticsearch nodes). From my understanding, the URL determines which Elasticsearch node the data is read from (e.g. elasticsearch.url: "http://localhost:9200").
Is this understanding correct? If so, it means that if the Elasticsearch instance on localhost fails, Kibana won't be able to read the data.
If I want a setup where Kibana can read from any of the nodes in the cluster, how do I achieve this?
Thanks in advance.
Kibana will make requests to the node at elasticsearch.url, and that node may end up talking to other nodes to retrieve the data. Running an Elasticsearch coordinating node on the same machine (guide) is currently the recommended approach for balancing. If that node does go down then correct, Kibana will go down with it.
Kibana doesn't currently support talking directly to multiple Elasticsearch nodes; we have an issue tracking it here that has some suggestions. You could put a load balancer between Kibana and Elasticsearch, or have multiple instances of Kibana pointing to different nodes with a load balancer in front.
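To illustrate the two options above, here is a rough sketch of the relevant kibana.yml setting (a sketch only, assuming a Kibana version that uses elasticsearch.url; hostnames are placeholders):

# kibana.yml (sketch)

# Option 1: a coordinating-only Elasticsearch node running on the same
# machine as Kibana; it forwards requests to the rest of the cluster.
elasticsearch.url: "http://localhost:9200"

# Option 2: a load balancer sitting in front of the Elasticsearch nodes.
# elasticsearch.url: "http://es-loadbalancer.example.com:9200"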
Thanks for the reply. It was pretty useful.
I have gone through the load balancing mechanism and am in the process of implementing it.
For the following configuration change in the elasticsearch.yml files (for balancing):
transport.host: external ip
transport.tcp.port: 9300-9400
Should this change be made on all nodes in the cluster?
And does the external ip refer to the IP of the coordinating node?
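For context, a coordinating-only node's elasticsearch.yml might look roughly like the following. This is a sketch only, assuming Elasticsearch 5.x/6.x settings; the cluster name, IPs and ports are placeholders and not taken from the thread:

# elasticsearch.yml (sketch) for a coordinating-only node co-located with Kibana
cluster.name: my-cluster            # must match the rest of the cluster
node.name: coordinating-node-1

# Disable master, data and ingest roles so this node only routes requests
node.master: false
node.data: false
node.ingest: false

# Bind the transport layer to an address the other nodes can reach
transport.host: 10.0.0.4            # placeholder: this node's external IP
transport.tcp.port: 9300-9400

# Point discovery at the existing cluster nodes (placeholder IPs)
discovery.zen.ping.unicast.hosts: ["10.0.0.1", "10.0.0.2", "10.0.0.3"]

# HTTP port that Kibana's elasticsearch.url talks to
http.port: 9200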