Writing to Elasticsearch from Spark failing

Hi,
I am trying to write to Elasticsearch from Spark running on my local machine. I have a DataFrame df and am doing this:

```scala
df.write.format("org.elasticsearch.spark.sql")
  .option("es.nodes", "abc-es.xyz.com")
  .option("es.port", "9200")
  .option("es.mapping.id", "id-field")
  .mode("append")
  .save("my_index_alias/doc")
```

When I do this with "es.nodes" set to "localhost", it works fine and I can see the documents in my local Elasticsearch. But I need to write to our actual Elasticsearch cluster.
I am able to reach the Elasticsearch cluster from a browser at "https://abc-es.xyz.com"; the browser shows the JSON with name, cluster_name, cluster_uuid, version, and tagline.
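
For comparison, this is what I think the write might need to look like if the cluster is only exposed over HTTPS on port 443 behind a proxy or load balancer. The es.port, es.net.ssl, and es.nodes.wan.only values here are my assumptions, not something I have confirmed works:

```scala
// Sketch only, based on my guess that the cluster sits behind an HTTPS endpoint (443)
// and the individual data nodes are not directly reachable from my machine.
df.write.format("org.elasticsearch.spark.sql")
  .option("es.nodes", "abc-es.xyz.com")
  .option("es.port", "443")              // assumption: the HTTPS port, not 9200
  .option("es.net.ssl", "true")          // assumption: TLS is terminated at this endpoint
  .option("es.nodes.wan.only", "true")   // assumption: don't try to connect to internal node IPs
  .option("es.mapping.id", "id-field")
  .mode("append")
  .save("my_index_alias/doc")
```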

But from Spark I am getting:

".. INFO HttpMethodDirector: I/O exception (java.net.ConnectException) caught when processing request: Connection refused..
....
...
ERROR NetworkClient: Node [<IP of abc-es.xyz.com>:9200] failed (Connection refused...."
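
In case it helps with diagnosis, this is the kind of check I was going to run from the same machine the Spark driver is on, just to see whether anything is listening on port 9200 at all. The host and timeout are from my setup; the check itself is only my own debugging idea:

```scala
import java.net.{InetSocketAddress, Socket}

// Try a raw TCP connection to the same host/port the connector is using.
// "Connection refused" here as well would mean nothing is listening on 9200
// from this machine, independent of Spark or the connector.
val socket = new Socket()
try {
  socket.connect(new InetSocketAddress("abc-es.xyz.com", 9200), 5000)
  println("port 9200 is reachable")
} catch {
  case e: java.io.IOException => println(s"port 9200 not reachable: ${e.getMessage}")
} finally {
  socket.close()
}
```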

Any idea what could be the issue here?

Thanks
