Spark Scala "saveToEs" uses the wrong IP

Hi,
I am trying out Spark with Elasticsearch.
Setup:
Windows 10 Pro
Spark version: 2.4.0
Scala version: 2.11.12
ELK version: 7.10
I start the shell like so:

.\spark-shell --jars C:\spark\jars\elasticsearch-spark-20_2.11-7.10.0.jar
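If I understand the connector docs correctly, es.* settings can also be passed on the command line by prefixing them with spark. (Spark only forwards properties that start with spark., and elasticsearch-hadoop strips the prefix back off), so presumably I could also start the shell like this, with my host and the default port:

.\spark-shell --jars C:\spark\jars\elasticsearch-spark-20_2.11-7.10.0.jar --conf spark.es.nodes=109.111.21.26 --conf spark.es.port=9200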

The code I run in the shell:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.elasticsearch.spark._

val elasticIndex = "pmolog"
val url = "http://109.111.21.26:9200"

// Reader options point the connector at the remote cluster
val reader = spark.sqlContext.read.
  format("org.elasticsearch.spark.sql").
  option("es.nodes.wan.only", "false").
  option("es.net.http.auth.user", "elastic").
  option("es.net.http.auth.pass", "changeme").
  option("es.nodes", url)

val df = reader.load(elasticIndex)
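Reading works as far as I can tell; for example, I can inspect the loaded data:

// Sanity-check the DataFrame that came back from the index
df.printSchema()
df.show(5)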

Then I try to index into Elasticsearch like below:

import org.elasticsearch.spark.sql._
df.saveToEs("demoindex")

I encounter this error (note that the URL in the error message is different):

scala> df.saveToEs("demoindex")
2021-05-06 16:58:47 ERROR NetworkClient:155 - Node [127.0.0.1:9200] failed (Connection refused: connect); no other nodes left - aborting...

May I know how I can make sure it connects to my URL, 109.111.21.26:9200?
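Alternatively, would passing the connection settings directly to the write help? A sketch of what I have in mind, using the saveToEs overload that takes a per-job configuration Map (the values mirror the ones I gave the reader):

import org.elasticsearch.spark.sql._

// Hand the connection settings to the write itself rather than
// relying on whatever the connector picks up from the SparkConf
df.saveToEs("demoindex", Map(
  "es.nodes" -> "109.111.21.26",
  "es.port" -> "9200",
  "es.net.http.auth.user" -> "elastic",
  "es.net.http.auth.pass" -> "changeme"
))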
