We are using Spark to load data into Elasticsearch. We are on Elastic Cloud, and our Spark-Elasticsearch connector configuration is as below:
```scala
val conf = new SparkConf().setAppName("loaduserdata").setMaster("local")
  .set("es.index.auto.create", "true")
  .set("es.nodes", "xxxx.eu-west-1.aws.found.io")
  .set("es.port", "9243")
  .set("es.net.http.auth.user", "xxx")
  .set("es.net.http.auth.pass", "xxx")
  .set("es.nodes.wan.only", "false")
```
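For context, the data is written with the connector's standard `saveToEs` call from the `org.elasticsearch.spark.sql` package, which is what fails at `loadtoes.scala:37` in the trace below. A minimal sketch of the write path, using the `conf` above; the input path, DataFrame name, and index name are placeholders, not our exact code:

```scala
import org.apache.spark.sql.SparkSession
import org.elasticsearch.spark.sql._   // brings saveToEs into scope on DataFrames

// Minimal sketch of the write path; "users.json" and "users/user" are
// placeholders, not our real input or index.
val spark = SparkSession.builder().config(conf).getOrCreate()
val users = spark.read.json("users.json")
users.saveToEs("users/user")   // the version-discovery error below is thrown here
```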
We are getting the error below:

```
Exception in thread "main" org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Cannot detect ES version - typically this happens if the network/Elasticsearch cluster is not accessible or when targeting a WAN/Cloud instance without the proper setting 'es.nodes.wan.only'
at org.elasticsearch.hadoop.rest.InitializationUtils.discoverEsVersion(InitializationUtils.java:327)
at org.elasticsearch.spark.sql.EsSparkSQL$.saveToEs(EsSparkSQL.scala:97)
at org.elasticsearch.spark.sql.EsSparkSQL$.saveToEs(EsSparkSQL.scala:80)
at org.elasticsearch.spark.sql.package$SparkDataFrameFunctions.saveToEs(package.scala:48)
at com.spark.tmmt.loadtoes$.main(loadtoes.scala:37)
at com.spark.tmmt.loadtoes.main(loadtoes.scala)
Caused by: org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException: Connection error (check network and/or proxy settings)- all nodes failed; tried [[34.253.152.23:9243]]
at org.elasticsearch.hadoop.rest.NetworkClient.execute(NetworkClient.java:152)
at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:398)
at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:362)
at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:366)
at org.elasticsearch.hadoop.rest.RestClient.get(RestClient.java:161)
at org.elasticsearch.hadoop.rest.RestClient.remoteEsVersion(RestClient.java:593)
at org.elasticsearch.hadoop.rest.InitializationUtils.discoverEsVersion(InitializationUtils.java:320)
	... 5 more
```