Connecting Spark to ES

I am getting the following exception while attempting to read from Elasticsearch 5.2.0 into Spark 2.1.0 (spark-2.1.0-bin-hadoop2.7) on localhost, using the ES-Hadoop connector (elasticsearch-hadoop-5.1.2.jar):
"Cannot find node with id". The single ES master/data node is up (see the _nodes output below). What might be causing the problem?

curl -XGET localhost:9200/_nodes/http
{"_nodes":{"total":1,"successful":1,"failed":0},"cluster_name":"elasticsearch","nodes":{"ltPIKS-wS7CJOxd_owuUpA":{"name":"ltPIKS-","transport_address":"127.0.0.1:9300","host":"127.0.0.1","ip":"127.0.0.1","version":"5.2.0","build_hash":"24e05b9","roles":["master","data","ingest"],"http":{"bound_address":["[::1]:9200","127.0.0.1:9200"],"publish_address":"127.0.0.1:9200","max_content_length_in_bytes":104857600}}}}

@yurik Could you please post the full stack trace, any logs from the run, and your configuration settings?

The following solves the problem; note that the ES-Hadoop jar now matches the Elasticsearch version (5.2.0) and that es.nodes is set explicitly (ES-Hadoop picks up es.* settings passed through Spark with the spark. prefix):
./spark-shell --jars /elasticsearch-hadoop-5.2.0.jar --conf spark.es.nodes="localhost"
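Once the shell is up, a quick read confirms the connector can reach the node; my_index/my_type is again a hypothetical placeholder:

import org.elasticsearch.spark._

// Counting documents forces a round trip to the node at es.nodes
val rdd = sc.esRDD("my_index/my_type")
println(rdd.count())

Equivalently, es.nodes can be set programmatically on the SparkConf instead of on the command line.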
