ES-Hadoop with TLS

Has anybody been able to use the ES-Hadoop connector with an Elasticsearch cluster where TLS is enabled?

If so, what configuration params did you use, and what kind of certificate?

I'm trying to connect to a 7.x ES cluster with X-Pack security and TLS enabled. I'm using a CA and an instance certificate generated by elasticsearch-certutil. The verification mode is set to certificate, so all ES nodes share the same certificate, which is in PKCS12 format. TLS between the ES nodes themselves works fine.
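For reference, this is roughly how the certificates were generated (the file names below are the certutil defaults; your paths and names may differ):

```sh
# Create a CA (writes elastic-stack-ca.p12 by default)
bin/elasticsearch-certutil ca

# Create an instance certificate signed by that CA
# (writes elastic-certificates.p12 by default)
bin/elasticsearch-certutil cert --ca elastic-stack-ca.p12
```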

But I haven't been able to get ES-Hadoop to work over TLS.

In the Spark configuration, I enabled ES SSL and specified the keystore and truststore, pointing both at the same PKCS12 keystore that the ES nodes use. I also set PKCS12 as the keystore type.
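Here is roughly what my Spark configuration looks like (a sketch; the node address, keystore path, and password are placeholders rather than my real values):

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .setAppName("es-hadoop-tls-test")
  // example coordinates; replace with your own nodes
  .set("es.nodes", "es-node1")
  .set("es.port", "9200")
  // enable TLS for the ES-Hadoop REST client
  .set("es.net.ssl", "true")
  // the same PKCS12 file the ES nodes use, as both keystore and truststore
  .set("es.net.ssl.keystore.location", "file:///etc/elasticsearch/certs/elastic-certificates.p12")
  .set("es.net.ssl.keystore.pass", "changeme")
  .set("es.net.ssl.keystore.type", "PKCS12")
  .set("es.net.ssl.truststore.location", "file:///etc/elasticsearch/certs/elastic-certificates.p12")
  .set("es.net.ssl.truststore.pass", "changeme")
```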

But ES rejects the requests from ES-Hadoop.

Error

```
[o.e.t.TcpTransport] [test-es-data-node1] exception caught on transport layer [Netty4TcpChannel{localAddress=0.0.0.0/0.0.0.0:9300, remoteAddress=/10.0.1.81:44991}], closing connection
io.netty.handler.codec.DecoderException: javax.net.ssl.SSLException: Received fatal alert: internal_error
```

Is the certificate I'm using on the ES nodes not valid for use by ES-Hadoop?
Is there some alternate configuration required?
Any other avenues to try?

Thanks for sharing your issue with us. We're happy to help out where we can, but please avoid opening multiple topics for the same issue, as it can lead to clutter on the forums.

Since this is a decoding issue on the server side, you might have better luck asking in the security forum here. I understand from your previous topic that you can successfully connect with the REST client, but without more information from the server side it's difficult to see what ES-Hadoop might be doing wrong or which configs need to change.
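In the meantime, if you want more detail from the client side, one option (assuming a typical JVM-based Spark setup on your end) is to enable the JVM's standard JSSE debugging for the driver and executors, which prints the full TLS handshake to the logs:

```scala
import org.apache.spark.SparkConf

// -Djavax.net.debug is a standard JVM flag; it dumps the TLS
// handshake, including any alerts, to the process's stdout.
val debugConf = new SparkConf()
  .set("spark.driver.extraJavaOptions", "-Djavax.net.debug=ssl,handshake")
  .set("spark.executor.extraJavaOptions", "-Djavax.net.debug=ssl,handshake")
```

That should show which step of the handshake is failing before the server closes the connection.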
