Spark and Elasticsearch node definition issue

Hi

I have a VM running Elasticsearch (elasticsearch-2.1.1) and another machine (RHEL) running Spark (spark-1.6.1-bin-hadoop2.6), using the elasticsearch-hadoop connector (elasticsearch-hadoop-2.3.2).

If I run Spark on the same VM (IP 1.2.3.4), there is no problem: I can run SQL queries from the spark-shell (although the memory is not sufficient).
If I run Spark from my second machine (IP 2.3.4.5), the initialization works, but the commands that follow return zero results or fail outright:
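
For reference, I start the spark-shell with the connector on the classpath, roughly like this on both machines (the jar path is specific to my setup, adjust as needed):

bin/spark-shell --jars elasticsearch-hadoop-2.3.2.jar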

scala> import org.elasticsearch.spark.sql._
import org.elasticsearch.spark.sql._

scala> import org.apache.spark.sql.SQLContext._
import org.apache.spark.sql.SQLContext._

scala> import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.SQLContext

scala> import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.classification.LogisticRegression

scala> val esConfig = Map("pushdown" -> "true", "es.nodes" -> "1.2.3.4", "es.port" -> "9200" , "path" -> "hfb")
esConfig: scala.collection.immutable.Map[String,String] = Map(pushdown -> true, es.nodes -> 1.2.3.4, es.port -> 9200, path -> hfb)

scala> val df = sqlContext.load("org.elasticsearch.spark.sql", esConfig)
warning: there were 1 deprecation warning(s); re-run with -deprecation for details
df: org.apache.spark.sql.DataFrame = [@timestamp: timestamp, @version: string, ALARM: string, date: string, host: string, message: string, path: string, tags: string, type: string]
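
(Side note: since sqlContext.load is deprecated in Spark 1.6, I believe the non-deprecated equivalent is the line below; here esConfig, including es.nodes, is attached to the reader, and the "path" entry selects the index.)

val df = sqlContext.read.format("org.elasticsearch.spark.sql").options(esConfig).load()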

scala> sqlContext.sql("CREATE TEMPORARY TABLE myIndex USING org.elasticsearch.spark.sql OPTIONS (resource 'hfb', scroll_size '20')" )
16/06/17 20:40:59 ERROR NetworkClient: Node [127.0.0.1:9200] failed (Connection refused); no other nodes left - aborting...
org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Cannot detect ES version - typically this happens if the network/Elasticsearch cluster is not accessible or when targeting a WAN/Cloud instance without the proper setting 'es.nodes.wan.only'
at org.elasticsearch.hadoop.rest.InitializationUtils.discoverEsVersion(InitializationUtils.java:190)
at org.elasticsearch.spark.sql.SchemaUtils$.discoverMappingAsField(SchemaUtils.scala:76)
at org.elasticsearch.spark.sql.SchemaUtils$.discoverMapping(SchemaUtils.scala:69)
at org.elasticsearch.spark.sql.ElasticsearchRelation.lazySchema$lzycompute(DefaultSource.scala:110)

When I run the same thing from the local machine (where I have also installed the same Spark and packages), it works. So it looks like, on the remote machine, the temporary table ignores es.nodes and falls back to 127.0.0.1:9200.
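
In case it helps, this is what I would try next, as a sketch (untested from the remote machine): skip the DDL and register the already-loaded DataFrame, so that esConfig, including es.nodes, definitely reaches the relation, and set es.nodes.wan.only as the exception message hints.

// keep the connection settings attached to the DataFrame instead of relying on DDL OPTIONS
val esConfigWan = esConfig + ("es.nodes.wan.only" -> "true") // per the hint in the exception
val df2 = sqlContext.read.format("org.elasticsearch.spark.sql").options(esConfigWan).load()
df2.registerTempTable("myIndex")
sqlContext.sql("SELECT count(*) FROM myIndex").show()

Is this the right direction, or should the OPTIONS clause of CREATE TEMPORARY TABLE carry es.nodes itself?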