2 instances of Elastic and Kibana but data is shared


(Erezinho) #1

Hello,

I have a VM with two instances of Elasticsearch, in different folders. Each instance has its own port.
The same goes for the two Kibana instances:
ElasticA-KibanaA
ElasticB-KibanaB.

My client uses both Elasticsearch instances, depending on the execution context.
The index has the same name, runner, in both instances.

Now, when I access KibanaA, set the index, and define a dashboard, it also affects KibanaB, i.e. it's as if the data is shared between the two Elasticsearch instances when it should be separate.

Using different machines would of course work, but I'm trying to avoid that.
What am I missing here?
Is there a better way to keep the data separated?

Thanks!


(Zachary Tong) #2

If the two Elasticsearch nodes are in different clusters, they don't share data. I would recheck your configuration and make sure they haven't joined together to create a single cluster.


(Erezinho) #3

Thanks.
How do I check that?
Can you direct me to any reference regarding this?


(Zachary Tong) #4

The simplest way would be to call the Cluster Health API from either node: https://www.elastic.co/guide/en/elasticsearch/reference/current/cluster-health.html

If the health response says there are two nodes in the cluster, they have joined together. Otherwise, calling it against both nodes will show only one node in each cluster.
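A quick sketch of that check. Against live nodes you would curl each instance's port (9200 and 9201 here are assumptions; use whatever ports you configured); below, a saved example response is parsed instead so the snippet is self-contained:

```shell
# Against running nodes, the check would be:
#   curl -s 'http://localhost:9200/_cluster/health?pretty'
#   curl -s 'http://localhost:9201/_cluster/health?pretty'
# (ports 9200/9201 are assumptions for this sketch)

# Example health response, as a saved string, to show what to look for:
health='{"cluster_name":"elasticsearch","number_of_nodes":2,"status":"green"}'

# Extract the node count from the JSON:
nodes=$(printf '%s' "$health" | grep -o '"number_of_nodes":[0-9]*' | cut -d: -f2)
echo "nodes in cluster: $nodes"
# A value greater than 1 means the two instances have joined into one cluster.
```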

How did you configure the nodes? This is set in the elasticsearch.yml file, so you should check whether they are sharing the same cluster name, or using each other's IP addresses in the unicast list, etc.


(Erezinho) #5

That helped me find and solve the problem. Thanks!

So the clusters had indeed joined together.

For those who find this topic relevant, here's how I solved it:
1. Open elasticsearch.yml, uncomment cluster.name: and supply a unique name.
2. Repeat for the other Elasticsearch instance.
3. Restart each Elasticsearch.
4. Re-check the Cluster Health and make sure each cluster has 1 node (and that the cluster name is the one you set).
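For step 1, the relevant elasticsearch.yml fragment looks roughly like this (the names clusterA and clusterB are just examples; any two distinct names work):

```yaml
# elasticsearch.yml of the first instance:
cluster.name: clusterA

# elasticsearch.yml of the second instance:
cluster.name: clusterB
```

With distinct cluster names, the two nodes will refuse to join each other even if they discover one another on the same host.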


(Zachary Tong) #6

Awesome, glad you got it resolved :slight_smile:


(system) #7

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.