Error running PySpark while connecting to Elasticsearch

I am new to PySpark and am currently trying to connect to an Elasticsearch instance running on localhost.

Below are the details:

Elasticsearch host: localhost
Port: 9200
No HTTPS.

Code:

from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()
df = spark.read.format("org.elasticsearch.spark.sql").option("es.nodes", "localhost").option("es.port", "9200").load("products")
df.show()

Command run via terminal:

spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.2.0 PySp.py

Output of pyspark --version:
version 3.2.0
Using Scala version 2.12.15, OpenJDK 64-Bit Server VM, 11.0.21

Error:

df = spark.read.format("org.elasticsearch.spark.sql").option("es.nodes", "localhost").option("es.port", "9200").load("products")
  File "/opt/spark/python/lib/pyspark.zip/pyspark/sql/readwriter.py", line 158, in load
  File "/opt/spark/python/lib/py4j-0.10.9.2-src.zip/py4j/java_gateway.py", line 1309, in __call__
  File "/opt/spark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 111, in deco
  File "/opt/spark/python/lib/py4j-0.10.9.2-src.zip/py4j/protocol.py", line 326, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o28.load.
: java.util.ServiceConfigurationError: org.apache.spark.sql.sources.DataSourceRegister: org.elasticsearch.spark.sql.DefaultSource15 Unable to get public no-arg constructor
	at java.base/java.util.ServiceLoader.fail(ServiceLoader.java:582)
	at java.base/java.util.ServiceLoader.getConstructor(ServiceLoader.java:673)
	at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNextService(ServiceLoader.java:1233)
	at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNext(ServiceLoader.java:1265)
	at java.base/java.util.ServiceLoader$2.hasNext(ServiceLoader.java:1300)
	at java.base/java.util.ServiceLoader$3.hasNext(ServiceLoader.java:1385)
	at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:45)

Could you please help me resolve this issue?

Hi @kashi_mn. This looks like the same error we saw at Pyspark-Elasticsearch connectivity and latest version compatibilty - #6 by Bramha. It's worth double-checking which es-spark jar you're giving it: the --packages coordinate in your spark-submit command is the Kafka connector (spark-sql-kafka), not the Elasticsearch one, so the es-spark jar Spark is loading must be coming from somewhere else on your classpath, and it appears to be incompatible with your Spark version.
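For Spark 3.x with Scala 2.12, the elasticsearch-spark-30_2.12 artifact is the one to pass, with its version matched to your Elasticsearch release. For example (adjust 8.11.1 below to whatever version your cluster is running), the command would look something like:

spark-submit --packages org.elasticsearch:elasticsearch-spark-30_2.12:8.11.1 PySp.py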

Hi @Keith_Massey,
Thanks for the response. I am running Elasticsearch version 8.11.1.
I see that you have already recommended compatible Spark and elasticsearch-spark connector versions on the mentioned thread. I will try the same.
