Has anybody been able to use the es-hadoop connector with an ES cluster where TLS is enabled?
If so, what configuration params did you use, and what kind of certificate?
I'm trying to connect to a 7.x ES cluster with X-Pack security and TLS enabled. I'm using a CA and instance cert generated by elasticsearch-certutil. Verification mode is set to certificate, so all ES nodes use the same cert, which is in PKCS12 format. The ES nodes all communicate with each other fine over TLS.
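For context, this is roughly how the certs were generated (the standard elasticsearch-certutil flow; elastic-stack-ca.p12 is certutil's default CA file name):

```
bin/elasticsearch-certutil ca
bin/elasticsearch-certutil cert --ca elastic-stack-ca.p12
```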
But I haven't been able to get es-hadoop to work over TLS.
In the Spark conf, I enabled ES SSL and specified the keystore and truststore, pointing both at the same PKCS12 keystore that the ES nodes use. I also set the keystore type to PKCS12.
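Concretely, this is roughly what my Spark conf looks like. Paths, hostnames, and passwords are placeholders, and I've included the basic-auth settings since X-Pack security usually requires them:

```scala
import org.apache.spark.SparkConf

// Sketch of the es-hadoop SSL settings; paths/passwords are placeholders.
val conf = new SparkConf()
  .set("es.nodes", "es-node1")                       // placeholder hostname
  .set("es.port", "9200")                            // es-hadoop talks to the HTTP port, not 9300
  .set("es.net.http.auth.user", "elastic")           // X-Pack security credentials
  .set("es.net.http.auth.pass", "<password>")
  .set("es.net.ssl", "true")
  .set("es.net.ssl.keystore.location", "file:///path/to/elastic-certificates.p12")
  .set("es.net.ssl.keystore.pass", "<keystore-password>")
  .set("es.net.ssl.keystore.type", "PKCS12")
  .set("es.net.ssl.truststore.location", "file:///path/to/elastic-certificates.p12")
  .set("es.net.ssl.truststore.pass", "<truststore-password>")
```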
But ES rejects the requests from es-hadoop.
Error (from the ES node's log):
```
[o.e.t.TcpTransport ] [test-es-data-node1] exception caught on transport layer [Netty4TcpChannel{localAddress=0.0.0.0/0.0.0.0:9300, remoteAddress=/10.0.1.81:44991}], closing connection
io.netty.handler.codec.DecoderException: javax.net.ssl.SSLException: Received fatal alert: internal_error
```
Is the certificate I'm using on the ES nodes not valid for use by es-hadoop?
Is there some alternate configuration required?
Any other avenues to try?