Writing from Spark to Elasticsearch fails

I am using a single-node cluster, with Elasticsearch and Kibana installed on my local machine.
I wrote the logic in Spark to read the CSV files and create DataFrames. While writing those into Elasticsearch, it throws the following error:

Cannot detect ES version - typically this happens if the network/Elasticsearch cluster is not accessible or when targeting a WAN/Cloud instance without the proper setting 'es.nodes.wan.only'
at org.elasticsearch.hadoop.rest.InitializationUtils.discoverEsVersion(InitializationUtils.java:250)

The conf settings used in the code are given below:
spark.conf.set("es.index.auto.create", "true")
spark.conf.set("es.nodes.wan.only", "true")
spark.conf.set("es.nodes", "127.0.0.1")
spark.conf.set("es.port", "9200")

val cases_count_stats = spark.sql("select ... from cases_txn_cust_table group by case_identifier,case_description")
println("Writing to elastic")

cases_count_stats.saveToEs("case_summary1/cs_type")
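
For reference, the same settings can also be passed as a per-call map, which sidesteps the question of whether values set through spark.conf.set at runtime ever reach the connector. A minimal sketch of that variant, assuming the saveToEs(resource, cfg) overload provided by the org.elasticsearch.spark.sql import:

import org.elasticsearch.spark.sql._

// Per-call configuration: this map is handed directly to the
// connector, independent of when spark.conf.set was invoked.
val esConf = Map(
  "es.index.auto.create" -> "true",
  "es.nodes.wan.only"    -> "true",
  "es.nodes"             -> "127.0.0.1",
  "es.port"              -> "9200"
)

cases_count_stats.saveToEs("case_summary1/cs_type", esConf)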

This exception almost always means that your Spark driver or executors cannot reach your Elasticsearch cluster. If you are running with WAN-only mode enabled against localhost, make sure that Spark is executing only on your local machine; executors on other hosts would resolve 127.0.0.1 to their own loopback interface and never reach your node. Also make sure that Elasticsearch is up and running and responding to simple requests. Attaching TRACE-level logs might also help in troubleshooting these problems.
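
A quick way to verify that last point from the machine the driver runs on is to hit the node over plain HTTP: a healthy Elasticsearch instance answers GET / with a small JSON document listing its name and version. A minimal sketch in Scala, assuming the default port 9200:

import scala.io.Source

// A healthy node answers GET / with a JSON body that includes
// the cluster name and the Elasticsearch version number.
val response = Source.fromURL("http://127.0.0.1:9200").mkString
println(response)

If this fails or hangs, the problem is basic connectivity rather than the connector itself. For the TRACE logs, the connector's REST traffic is typically controlled by the org.elasticsearch.hadoop.rest logger in your log4j configuration.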
