Bulk-loading documents into Elasticsearch with PySpark

Hello,

I want to load a big file that contains many JSON documents into an existing index in Elasticsearch with PySpark.

I have a DataFrame and I try to write the JSON documents with this command:

dataframe.write.format("org.elasticsearch.spark.sql") \
    .option("es.nodes", ESnode) \
    .option("es.port", ESport) \
    .option("es.resource", ESresource)

where:

ESnode = "endpoint-private"
ESport = "privateport"
ESresource = "glue/glue"

The process does not fail, but no documents are loaded into the index.
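Could the issue be that the chain above only configures a DataFrameWriter without ever calling an action? As far as I understand, nothing is sent to Elasticsearch until .save() is invoked, so perhaps the call needs to end like this (the .mode("append") is my assumption, since the index already exists):

dataframe.write.format("org.elasticsearch.spark.sql") \
    .option("es.nodes", ESnode) \
    .option("es.port", ESport) \
    .option("es.resource", ESresource) \
    .mode("append") \
    .save()  # .save() is the action that actually performs the bulk write

This also assumes the elasticsearch-spark connector jar is on the classpath (e.g. via spark-submit --jars or --packages), since format("org.elasticsearch.spark.sql") comes from elasticsearch-hadoop.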

How can I load a big file containing many JSON documents into Elasticsearch with PySpark?
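For completeness, a minimal sketch of the read side (the file path here is a placeholder, not my real path):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bulk-to-es").getOrCreate()

# Placeholder path for illustration. spark.read.json expects
# newline-delimited JSON (one object per line) by default; a single
# file containing one big JSON array would instead need
# .option("multiLine", "true").
dataframe = spark.read.json("/path/to/big_file.json")

followed by the write call sketched above.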

Thanks.

Kind regards.
