Solved it. In case anyone stumbles on this thread and has the same problem, here are the steps I followed:
- In the init script of my cluster I pull the cert file from the Kubernetes secret:

```bash
kubectl get secret "test-es-http-certs-public" -o go-template='{{index .data "tls.crt" | base64decode }}' > /tmp/tls.crt
```

- Then I import it into the JVM's default keystore:

```bash
sudo keytool -import -alias elastic -storepass <default_pass> -noprompt -keystore $JAVA_HOME/lib/security/cacerts -file /tmp/tls.crt
```
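One gotcha worth checking before the Spark part: the `JAVA_HOME` the driver sees must point at the same JVM whose `cacerts` you just modified, since that path gets reused in the connector options below. A quick sanity check (a minimal sketch; the Temurin path is just what it resolved to on my cluster):

```python
import os

# The driver must resolve JAVA_HOME to the JVM whose cacerts was modified
# above. On my cluster this printed /usr/lib/jvm/temurin-11-jdk-amd64.
java_home = os.getenv("JAVA_HOME")
print(java_home)

# Confirm the keystore file the cert was imported into actually exists here.
print(os.path.exists(os.path.join(java_home, "lib", "security", "cacerts")))
```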
- Finally, in pyspark I did the following:

```python
import os

# In my case: /usr/lib/jvm/temurin-11-jdk-amd64
JAVA_HOME = os.getenv("JAVA_HOME")

options = {
    "es.index.auto.create": "true",
    "es.net.http.auth.user": "elastic",
    "es.net.http.auth.pass": "<password>",
    "es.nodes": "https://<loadbalancer_ip>:9200",
    "es.nodes.wan.only": "true",
    "es.net.ssl": "true",
    "es.net.ssl.cert.allow.self.signed": "true",
    "es.net.ssl.keystore.location": f"file://{JAVA_HOME}/lib/security/cacerts",
    "es.net.ssl.keystore.pass": "<default_pass>",
    "es.resource": "foo/",
}

df.write.mode("overwrite").format("org.elasticsearch.spark.sql").options(**options).save()
```
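To confirm the whole TLS setup works end to end, you can read the index back through the same connector. A minimal sketch, assuming the same `options` dict and an active `spark` session:

```python
# Read the index back with the same connector settings; if the TLS/auth
# configuration were wrong, this would fail with the same handshake errors
# as the write, so it doubles as a quick end-to-end check.
df_check = (
    spark.read.format("org.elasticsearch.spark.sql")
    .options(**options)
    .load("foo/")
)
df_check.show()
```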