Spark and elasticsearch-hadoop-2.0.0

Hi!
I am trying to set up a Spark + Elasticsearch solution, following the
simple example by Barnaby Gray [1] and adapting it to
elasticsearch-hadoop 2.0.0. I run into a connectivity error that I am
not able to overcome:

14/06/16 17:11:13 INFO HttpMethodDirector: I/O exception
(java.net.ConnectException) caught when processing request: Connection
timed out
14/06/16 17:11:13 INFO HttpMethodDirector: Retrying request

For the complete log: https://gist.github.com/sallum/6792a0505d107ea1e47f

I have the following scala code:
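(The code is essentially the skeleton from [1] updated for the 2.0.0 artifact; schematically it looks like the sketch below. The ES address matches my setup, while the app name, index/type and document contents are placeholders.)

```scala
// Minimal sketch along the lines of [1], adapted to elasticsearch-hadoop 2.0.0.
// Index/type name and document contents are placeholders.
import org.apache.hadoop.io.{MapWritable, NullWritable, Text}
import org.apache.hadoop.mapred.JobConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._   // brings saveAsHadoopDataset into scope
import org.elasticsearch.hadoop.mr.EsOutputFormat

object EsSparkTest {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local", "es-test")

    val jobConf = new JobConf(sc.hadoopConfiguration)
    jobConf.set("es.nodes", "172.17.10.20:9200") // the ES node shown in the log
    jobConf.set("es.resource", "myindex/mytype") // placeholder index/type
    jobConf.setOutputFormat(classOf[EsOutputFormat])

    // Convert each document (a Map) into the (NullWritable, MapWritable)
    // pairs that EsOutputFormat expects.
    val docs = sc.makeRDD(Seq(Map("message" -> "hello"), Map("message" -> "world")))
      .map { doc =>
        val writable = new MapWritable()
        doc.foreach { case (k, v) => writable.put(new Text(k), new Text(v)) }
        (NullWritable.get(), writable)
      }

    docs.saveAsHadoopDataset(jobConf)
    sc.stop()
  }
}
```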

I have verified that there is connectivity and that the port is open on
the address used:

$machine~> telnet 172.17.10.20 9200
Trying 172.17.10.20...
Connected to 172.17.10.20.
Escape character is '^]'.
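(Since telnet only proves the TCP port accepts connections, an HTTP-level check against the same address would confirm Elasticsearch itself responds; this fragment depends on the cluster above, so it is shown for reference only:)

```shell
# Confirms Elasticsearch answers HTTP on the same address as the telnet test.
curl -s http://172.17.10.20:9200
```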

Has anyone faced the same issue? Any ideas what it could be?

Thanks in advance!
Ignacio

[1] http://loads.pickle.me.uk/2013/11/12/spark-and-elasticsearch.html

--
You received this message because you are subscribed to the Google Groups "elasticsearch" group.
To view this discussion on the web visit https://groups.google.com/d/msgid/elasticsearch/9a3ef325-b2f0-46b6-92af-e243932935ab%40googlegroups.com.