I have set up a 2-node ES cluster as follows:
node-1: Elasticsearch and Kibana
node-2: Elasticsearch
I have successfully enabled HTTPS/TLS on node-1, both between the browser and Kibana and between Kibana and Elasticsearch. However:
When I try to enable HTTPS on node-2, it asks me for a username and password.
I am unable to make node-2 join the cluster with node-1.
I have generated an organisation-level certificate in .pfx format for node-1 and am using the same one for node-2. Is this an issue? Should a new certificate be issued for node-2?
Following is my ES config from node-1 and node-2:
node-1:
The ES logs on this node also show the following error: io.netty.handler.codec.DecoderException: javax.net.ssl.SSLException: Received fatal alert: bad_certificate
Please be patient in waiting for responses to your question, and refrain from pinging multiple times asking for a response or opening multiple topics for the same question. This is a community forum; it may take time for someone to reply to your question. For more information please refer to the Community Code of Conduct, specifically the section "Be patient". Also, please refrain from pinging folks directly; this is a forum, and anyone who participates might be able to assist you.
If you are in need of a service with an SLA that covers response times for questions then you may want to consider talking to us about a subscription.
It's fine to reply on your own thread after 2 or 3 days (not including weekends) if you don't have an answer.
Here's an excellent guide that I advise you to follow through:
Note that the author uses separate certificate/key files rather than keystores. IMO his approach is easier to follow; keystore and truststore concepts are harder to grasp for most users/admins/developers.
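For illustration, a PEM-based TLS section in elasticsearch.yml in the spirit of that guide would look roughly like the following; the file names and paths are placeholders, not anyone's actual config:

# elasticsearch.yml on node1 (illustrative only; adjust names and paths)
xpack.security.enabled: true
xpack.security.transport.ssl.enabled: true
xpack.security.transport.ssl.verification_mode: certificate
xpack.security.transport.ssl.key: certs/node1.key
xpack.security.transport.ssl.certificate: certs/node1.crt
xpack.security.transport.ssl.certificate_authorities: [ "certs/ca.crt" ]
xpack.security.http.ssl.enabled: true
xpack.security.http.ssl.key: certs/node1.key
xpack.security.http.ssl.certificate: certs/node1.crt
xpack.security.http.ssl.certificate_authorities: [ "certs/ca.crt" ]

node2 would use its own node2.key/node2.crt but the same ca.crt.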
@hunsw - I have followed this blog and it says you can use the scp command to copy certificates from node1 to node2, and that both nodes require the certificate and key in order to secure the connection.
So does that mean a SINGLE SSL certificate would have the domain names of both node-1 and node-2? I am totally new to network security, hence pardon my silly questions.
Also, can we use this certificate on the Logstash nodes as well, since we would have to enable security there too?
It's ok. In the guide he uses node1 to generate certificates.
After the generation step, the certificate and key of node2 and the certificate authority (CA) certificate have to be copied (with scp, for example) from node1 to node2.
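For example, something along these lines (file names and destination path are illustrative):

# run on node1: copy node2's certificate and key plus the CA certificate over to node2
scp node2.crt node2.key ca.crt user@node2:/etc/elasticsearch/certs/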
You only need the CA certificate on Logstash, so your Logstash nodes can be sure that the certificate of node1 and node2 are issued by a trusted authority.
EDIT: unless you want mutual authentication (with client cert).
Yes, you can download it from the browser too, though it also has to be somewhere on your nodes (probably in the truststore file).
Yes, for mutual auth you need the CA and the client cert too. Or you can just use a user/password combination for Logstash to authenticate in Elasticsearch and secure the channel with TLS.
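A minimal Logstash elasticsearch output along those lines might look like this; the host, credentials and CA path are placeholders, and option names can vary a little between Logstash versions:

output {
  elasticsearch {
    hosts    => ["https://node1:9200"]
    cacert   => "/etc/logstash/certs/ca.crt"   # CA certificate so Logstash trusts node1/node2
    user     => "logstash_writer"              # or set up mutual TLS with a client certificate instead
    password => "changeme"
  }
}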
Hi @hunsw - the setup is working fine from Logstash, thank you for your help!
However, I am currently facing issues with the nodes being unable to form a cluster after enabling security and generating node1.pfx and node2.pfx:
[node1] failed to establish trust with server at [<unknown host>]; the server provided a certificate with subject name [CN=node2...] [DNS:node2]; .......... certificate is not trusted in this ssl context ([xpack.security.transport.ssl])
@ikakavas - The config I am using is mentioned in this thread. I cannot mention the node names and other organisation-specific details, as that is against policy. If you need any other information, please let me know.
The certificates have been generated by my organisation and both node certificates have the same CA (I have confirmed this); the certificates are in .pfx format.
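One way to sanity-check this (assuming OpenSSL is available and you know the keystore passwords) is to dump each .pfx and confirm the CA certificate is actually bundled in it, and that the transport SSL truststore on each node points at a file containing that CA; the paths below are illustrative:

# list the certificates bundled in the PKCS#12 file, including any CA certificates
openssl pkcs12 -in node2.pfx -info -nokeys

# elasticsearch.yml on node2 (illustrative): keystore and truststore must both resolve to the shared CA
xpack.security.transport.ssl.keystore.path: certs/node2.pfx
xpack.security.transport.ssl.truststore.path: certs/node2.pfx
xpack.security.transport.ssl.verification_mode: certificate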