No route to ElasticSearch from Spark in Standalone mode

(Marco Trivolli) #1

I'm working on some simple tests with Spark (standalone mode) and ElasticSearch. All my tests work fine as long as ElasticSearch is running on the same host as Spark. If I move ElasticSearch to a different node, then I get an exception " ..."

I have ruled out any network/firewall issues, since I'm able to connect to the remote ElasticSearch node using Kibana.

The test is as simple as this:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.master("local").appName("testing")
  .config("es.nodes", "remotenode")
  .getOrCreate()
```


I'm using Spark with Scala 2.11 and elasticsearch-spark-20_2.11-5.3.2.

Any help?... I don't know what else to try!

(Jimmy Kuang) #2

On your remote ES node run the following:

```
nslookup remotenode
```


What is the output?

(Marco Trivolli) #3


```
** server can't find remotenode: NXDOMAIN

Non-authoritative answer:
name =

Authoritative answers can be found from:
nameserver =
nameserver =
```

I don't have a DNS configured for my test network... Shall I?

(Jimmy Kuang) #4

Yes, you should configure DNS if you plan to use hostnames; the problem is that the node cannot resolve `"es.nodes" = "remotenode"`. Alternatively, you can always use the IP address instead, e.g. `"es.nodes" = ""`.
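As a quick sanity check, you can test whether the JVM itself can resolve the hostname before involving Spark at all (a hypothetical helper, not part of elasticsearch-spark; Spark resolves `es.nodes` through the same JVM resolver, so a failure here means the connector will fail too):

```scala
import java.net.{InetAddress, UnknownHostException}

// Returns Some(ip) if the JVM resolver can look up the hostname
// (via DNS or /etc/hosts), None if it gets NXDOMAIN / no answer.
def resolve(host: String): Option[String] =
  try Some(InetAddress.getByName(host).getHostAddress)
  catch { case _: UnknownHostException => None }

// "localhost" should always resolve; a name like "remotenode" will
// only resolve once DNS or a hosts-file entry is configured for it.
```

If `resolve("remotenode")` returns `None` on the Spark node, adding an `/etc/hosts` entry there is a lightweight alternative to a full DNS setup for a test network.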

(system) #5

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.