I am using a single-node cluster, with Elasticsearch and Kibana installed on my local machine.
I created the logic in Spark to read the CSV files and create DataFrames. While writing them to Elasticsearch, it throws the following error:
Cannot detect ES version - typically this happens if the network/Elasticsearch cluster is not accessible or when targeting a WAN/Cloud instance without the proper setting 'es.nodes.wan.only'
at org.elasticsearch.hadoop.rest.InitializationUtils.discoverEsVersion(InitializationUtils.java:250)
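For context, the CSV read and the temp view used later look roughly like this (the file path and read options here are placeholders, not the exact job code):

// Minimal sketch of the CSV read and temp view registration;
// the path and read options are illustrative only.
val cases_txn_cust = spark.read
  .option("header", "true")        // first row holds column names
  .option("inferSchema", "true")   // let Spark infer column types
  .csv("/path/to/cases_txn_cust.csv")

// Register the DataFrame so it can be queried with spark.sql(...)
cases_txn_cust.createOrReplaceTempView("cases_txn_cust_table")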
The conf settings in the code are given below:
spark.conf.set("es.index.auto.create", "true")
spark.conf.set("es.nodes.wan.only", "true")
spark.conf.set("es.nodes", "127.0.0.1")
spark.conf.set("es.port", "9200")
import org.elasticsearch.spark.sql._   // provides the saveToEs implicit on DataFrames

val cases_count_stats = spark.sql("select ... from cases_txn_cust_table group by case_identifier, case_description")
println("Writing to elastic")
cases_count_stats.saveToEs("case_summary1/cs_type")
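The same write can also be expressed through the DataFrame writer of the elasticsearch-spark connector, with the es.* settings passed directly as writer options; this is only an equivalent sketch, not the code from the job above:

// Equivalent write via the DataFrame API, with connection settings on the writer itself.
cases_count_stats.write
  .format("org.elasticsearch.spark.sql")
  .option("es.nodes", "127.0.0.1")
  .option("es.port", "9200")
  .option("es.nodes.wan.only", "true")
  .option("es.index.auto.create", "true")
  .mode("append")
  .save("case_summary1/cs_type")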