Error indexing a remote ES cluster using Spark on ES 5 alpha3

Hi,
I am getting the following error when I try to index into a remote cluster using Spark. It works fine locally on my Mac when I index into the Elasticsearch instance running there. I tried passing es.nodes.wan.only as true, but I still get the same error.

executor localhost: org.elasticsearch.hadoop.EsHadoopIllegalArgumentException (Cannot detect ES version - typically this happens if the network/Elasticsearch cluster is not accessible or when targeting a WAN/Cloud instance without the proper setting 'es.nodes.wan.only') [duplicate 1]
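
For reference (my assumption, not something stated above): es.nodes.wan.only normally has to be set on the SparkConf that is used to build the SparkContext, or passed via --conf when launching spark-shell/spark-submit, otherwise the connector never sees it. A minimal sketch, using a hypothetical remote host name rather than a value from this thread:

import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.spark._

// Hypothetical host/port; replace with the actual remote cluster address.
val conf = new SparkConf()
  .setAppName("es-wan-test")
  .set("es.nodes", "remote-es-host")
  .set("es.port", "9200")
  .set("es.nodes.wan.only", "true")   // restrict the connector to the declared nodes only
val sc = new SparkContext(conf)

// Quick probe document to verify connectivity and indexing.
sc.makeRDD(Seq(Map("probe" -> 1))).saveToEs("spark/docs")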

I am also getting the same error. Was your problem resolved?

I'm having this issue as well. Did you get it resolved?

I have the same issue. The versions are as follows:
1. es-hadoop: 5.2.1
2. Spark: 2.1
3. Scala: 2.11

Scala script:
import org.apache.spark.SparkConf
import org.elasticsearch.spark._

val conf = new SparkConf()
conf.set("es.index.auto.create", "true")
conf.set("es.nodes", "10.10.13.120")
conf.set("es.nodes.wan.only", "true")
val numbers = Map("one" -> 1, "two" -> 2, "three" -> 3)
val airports = Map("OTP" -> "Otopeni", "SFO" -> "San Fran")
sc.makeRDD(Seq(numbers, airports)).saveToEs("spark/docs")

I also traced the shell with strace and found that no connection was created (the connect syscall was never invoked).
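
One possible explanation (an assumption on my part, not something the strace output proves): in spark-shell the SparkContext sc already exists when the script runs, so settings placed on a freshly created SparkConf never reach the connector. A minimal sketch that instead passes the settings per call, using the saveToEs overload that takes a configuration map:

import org.elasticsearch.spark._

// Connector settings passed per call, so they apply even though sc was
// created before this conf; values taken from the script above.
val esCfg = Map(
  "es.index.auto.create" -> "true",
  "es.nodes"             -> "10.10.13.120",
  "es.nodes.wan.only"    -> "true"
)

val numbers  = Map("one" -> 1, "two" -> 2, "three" -> 3)
val airports = Map("OTP" -> "Otopeni", "SFO" -> "San Fran")

sc.makeRDD(Seq(numbers, airports)).saveToEs("spark/docs", esCfg)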