I understand from the documentation that when I get this error:
SparkException: Job aborted due to stage failure: Task 0 in stage 20.0 failed 4 times, most recent failure: Lost task 0.3 in stage 20.0 (TID 75, 10.244.241.30, executor 4): org.elasticsearch.hadoop.rest.EsHadoopParsingException: org.elasticsearch.hadoop.EsHadoopIllegalStateException: Field 'metadata.a.b.c.d' not found; typically this occurs with arrays which are not mapped as single value
I should do

spark.read.xx
  .option("es.read.field.as.array.include", "metadata.a.b.c,metadata.a.b.c.d,metadata.a.b.c.d.e")
But it keeps throwing the same error even when I include the option above.
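For context, here is a fuller sketch of the read call; the host, index name, and app name are placeholders I've substituted, not the exact values from my job:

import org.apache.spark.sql.SparkSession

// Placeholder session config; es.nodes/es.port point at my cluster.
val spark = SparkSession.builder()
  .appName("es-read-repro")
  .config("es.nodes", "10.244.241.30")
  .config("es.port", "9200")
  .getOrCreate()

val df = spark.read
  .format("org.elasticsearch.spark.sql")
  .option("es.read.field.as.array.include",
    "metadata.a.b.c,metadata.a.b.c.d,metadata.a.b.c.d.e")
  .load("my-index") // placeholder index name

// The exception only surfaces once an action forces documents to be parsed:
df.show()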
Apache Spark 2.4.5
Scala 2.11
ES-Hadoop 7.6.2
Any thoughts?