Sending data to Elasticsearch

Hi,

Below is the configuration I have for my 2 node elasticsearch cluster.

Please let me know how I should configure the Logstash elasticsearch output. Should the data be sent to both of the nodes below?

I am fully aware that a 2-node setup is not the best way forward, but my company has resource constraints, so this is the best we can set up as of now.

node-1

cluster.name: es-cluster
node.name: node1
path.data: E:/elasticsearch/data
path.logs: E:/elasticsearch/logs
network.host: node1.org
discovery.seed_hosts: ["node1","node2"]
cluster.initial_master_nodes: ["node2"]
node.master: true
node.data: true

node-2

cluster.name: es-cluster
node.name: node2
path.data: E:/elasticsearch/data
path.logs: E:/elasticsearch/logs
network.host: node2.org
discovery.seed_hosts: ["node1","node2"]
cluster.initial_master_nodes: ["node2"]
node.master: true
node.data: true

Hi,
Yes, I would recommend configuring the output to point to both nodes.
I would also recommend having a look at ILM (index lifecycle management).
The exact output configuration depends on your infrastructure, but the documentation has multiple examples of how to configure it (with/without ILM, with/without SSL, etc.).
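A minimal sketch of such an output block, assuming the node names from the config above, the default port 9200, and a placeholder index name:

```conf
output {
  elasticsearch {
    # Listing both nodes lets Logstash load-balance requests across them
    hosts => ["http://node1.org:9200", "http://node2.org:9200"]
    index => "my-logs-%{+YYYY.MM.dd}"
  }
}
```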

Regards,
Simon

Ok, thank you, I will take a look at that.

I have enabled SSL/TLS across the ELK stack. What else do you recommend I do for a 2-node setup, @KoettingSimon?

You have already configured both nodes to be data and master nodes.
So the only further thing I can recommend is to keep the number of replicas for all indices at 1 (the default setting).
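For an existing index, the replica count can be checked or set via the index settings API; a sketch (the index name is a placeholder):

```
PUT my-logs/_settings
{
  "index.number_of_replicas": 1
}
```

With 2 nodes and 1 replica, every shard has a copy on each node, so the cluster can still serve data if one node is lost.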

Best,
Simon

Sure, confirmed regarding the replicas, set to 1 🙂

Ok, when the data is sent to only one node in a cluster setup, it gets replicated on the other node, right? Why do we then need to send data to both nodes? @KoettingSimon

You do not send it to both nodes; you send it to either node, which allows you to load balance. If you had more than 2 nodes, this is also how you would achieve high availability.

Thanks @Christian_Dahlqvist. I will send it to only the master node.

Also, regarding encrypted communications between the two Elasticsearch nodes: do I need to generate 2 separate SSL certificates and add them this way -
SSL cert of node1 in elasticsearch.yml of node2 AND
SSL cert of node2 in elasticsearch.yml of node1

OR

just a single SSL certificate which contains the domain names of both node1 and node2?

I currently have Kibana installed on node2 with the domain name node2.org, so this should point to node1 as well, right?

Hi @nityaraj06

Yes, that's right.

As Christian said, you do not send the data to both nodes simultaneously. If you configure multiple hosts in the output configuration, Logstash will load-balance between the hosts.

From the documentation:

Sets the host(s) of the remote instance. If given an array it will load balance requests across the hosts specified in the hosts parameter.

So you don't need to configure the output with both hosts, but I recommend it so you can use the load balancing.

As I can see in the config you posted, both nodes are master-eligible, so which node do you mean?

You need to create a certificate for each node and configure that node's certificate in its elasticsearch.yml, so on node1 you configure the node1 certificate.
Each node needs to trust the issuing CA.
To get started with Elastic Stack security, I recommend this blog article.


I'm a little confused here; should I add both hosts or not? I would like the setup to be load balanced.

Yes, if you want to use load balancing you need to configure both hosts in the elasticsearch output.

Ok, thanks for all your help!

Also, when I hit the Kibana link, would the request go to either node1 or node2 if I configure both in the Logstash output (since that would make it load balanced)?

No, that configuration only applies to Logstash.
You can also configure multiple hosts in the Kibana config, so Kibana will load balance as well.

Ok so I have already set this in kibana.yml

elasticsearch.hosts: ["http://node1:9200","http://node2:9200"]
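One thing worth noting: since SSL/TLS was enabled on the HTTP layer earlier in the thread, the Kibana hosts would need to use https, and Kibana needs to trust the issuing CA. A sketch, assuming a CA certificate exported to E:/ca.crt (the path is a placeholder):

```yaml
elasticsearch.hosts: ["https://node1.org:9200", "https://node2.org:9200"]
# Kibana must trust the CA that issued the node certificates
elasticsearch.ssl.certificateAuthorities: ["E:/ca.crt"]
```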

You need to create a certificate for each node and configure that node's certificate in its elasticsearch.yml, so on node1 you configure the node1 certificate.
Each node needs to trust the issuing CA.

Since my setup will be load balanced now, do I still have to get 2 separate certificates, or can I include both node1 and node2 domains in a single certificate? I have raised a request for a single cert for node2; just thinking whether I should include node1 in it or raise a separate cert for node1.

As I can see in the config you posted, both nodes are master-eligible, so which node do you mean?

I meant node2, but now I will add both nodes in my Logstash output.

I'm not an SSL pro, but as I can read here, this should be possible.
But why do you want to use the same certificate? I would recommend generating separate certificates.
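One common way to generate per-node certificates signed by the same CA is Elasticsearch's bundled elasticsearch-certutil tool; a sketch, run from the Elasticsearch home directory, with node names and DNS entries taken from the configs above:

```
# Create a CA once (produces elastic-stack-ca.p12 by default)
bin/elasticsearch-certutil ca

# Issue one certificate per node, signed by that CA,
# with the node's DNS name as a subject alternative name
bin/elasticsearch-certutil cert --ca elastic-stack-ca.p12 --name node1 --dns node1.org
bin/elasticsearch-certutil cert --ca elastic-stack-ca.p12 --name node2 --dns node2.org
```

Because both node certificates come from the same CA, each node trusts the other automatically once that CA is in its truststore.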

Hi @KoettingSimon, as suggested I have got separate SSL certificates for node1 and node2, and I have configured the node1 SSL certificate on node1 and likewise for node2.

However, I am receiving the below error now:

[server] failed to establish trust with server at []; certificate is not trusted in this ssl context ([xpack.security.transport.ssl])

node1

xpack.security.enabled: true

xpack.security.http.ssl.enabled: true
xpack.security.http.ssl.keystore.path: E:/node1.pfx
xpack.security.http.ssl.truststore.path: E:/node1.pfx 

xpack.security.transport.ssl.enabled: true
xpack.security.transport.ssl.verification_mode: certificate
xpack.security.transport.ssl.keystore.path: E:/node1.pfx
xpack.security.transport.ssl.truststore.path: E:/node1.pfx 

node2

xpack.security.enabled: true

xpack.security.http.ssl.enabled: true
xpack.security.http.ssl.keystore.path: E:/node2.pfx
xpack.security.http.ssl.truststore.path: E:/node2.pfx 

xpack.security.transport.ssl.enabled: true
xpack.security.transport.ssl.verification_mode: certificate
xpack.security.transport.ssl.keystore.path: E:/node2.pfx
xpack.security.transport.ssl.truststore.path: E:/node2.pfx 

Is the issuing CA certificate included in the .pfx files?

Yes, it is included.

I am currently getting this error:

java.security.cert.CertificateException: No subject alternative names matching IP address <ip-add> found

It sounds like you are trying to access the node via its IP address over SSL, but the IP is not included in the certificate as a subject alternative name.
Some info about subject alternative names can be found here.
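A quick way to verify which SANs a certificate actually contains is openssl. The sketch below generates a throwaway self-signed certificate with two DNS SANs and then prints them; for the real node certificates you would point the second command at the cert exported from the .pfx instead. Requires OpenSSL 1.1.1+ for the -addext/-ext options; all file names here are placeholders.

```shell
# Generate a throwaway self-signed cert with DNS SANs (demo only)
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout demo-key.pem -out demo-cert.pem \
  -subj "/CN=node1.org" \
  -addext "subjectAltName=DNS:node1.org,DNS:node2.org"

# Print the SAN extension; an address you connect with must appear here
# (an IP as IP:..., a hostname as DNS:...) or hostname verification fails
openssl x509 -in demo-cert.pem -noout -ext subjectAltName
```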

I haven't mentioned any IP (only the DNS name, which matches the SAN of that node) in elasticsearch.yml, so I'm not sure why I am receiving this error.

Should I try adding the certificate of node2 to the elasticsearch.yml settings?