The error should be self-explanatory: the Elasticsearch cluster is not accessible. Make sure that 1.1.1.1 is reachable from the Spark cluster and that the REST interface is enabled and exposed.
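A quick way to confirm reachability is to probe the REST interface from a machine inside the Spark cluster. The port below is an assumption: 9200 is the Elasticsearch default HTTP port, so adjust it if your deployment uses another.

```
# Probe the Elasticsearch REST interface from a Spark worker node.
# 9200 is the default HTTP port; change it if your cluster differs.
curl -s http://1.1.1.1:9200
```

A JSON banner containing the cluster name means the REST interface is reachable; a timeout or "connection refused" confirms the network problem the exception is reporting.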
My Spark job fails with this error:
org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException: Connection error
(check network and/or proxy settings)- all nodes failed
My code:
val conf = new SparkConf()
conf.set("es.nodes", "1.1.1.1")
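Beyond the host address, a couple of connector settings commonly cause "all nodes failed". A minimal sketch of a fuller configuration, assuming the elasticsearch-spark connector is on the classpath; the port value and the WAN flag are assumptions to adapt to your deployment:

```
// Sketch only: adjust values to your cluster.
import org.apache.spark.SparkConf

val conf = new SparkConf()
conf.set("es.nodes", "1.1.1.1")
conf.set("es.port", "9200")            // REST port; 9200 is the default
conf.set("es.nodes.wan.only", "true")  // restrict the connector to the
                                       // addresses listed in es.nodes
```

Setting es.nodes.wan.only helps when Spark can reach only the declared address (e.g. a cloud or Docker setup) and not the node addresses Elasticsearch advertises during discovery.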