multiple values

I understand from documentation that when I get this error

SparkException: Job aborted due to stage failure: Task 0 in stage 20.0 failed 4 times, most recent failure: Lost task 0.3 in stage 20.0 (TID 75, executor 4): org.elasticsearch.hadoop.EsHadoopIllegalStateException: Field 'metadata.a.b.c.d' not found; typically this occurs with arrays which are not mapped as single value

I should do
.option("es.read.field.as.array.include", "metadata.a.b.c,metadata.a.b.c.d,metadata.a.b.c.d.e")

But it keeps throwing the error even when I include the option above.

Apache Spark 2.4.5
Scala 2.11
ES-Hadoop 7.6.2

Any thoughts?

`es.read.field.as.array.include` is a finicky property that can cause some very confusing serialization errors in the connector. Can you post the mapping and an average document that you're trying to read? Are the `c`, `d`, and `e` fields meant to be treated as three nested arrays?
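For reference, here is a minimal sketch of how that property is typically set on a Spark read (untested against a live cluster; the index name `my-index` and the Spark session setup are placeholders, and the field list is taken from your post):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("es-array-read")
  .getOrCreate()

val df = spark.read
  .format("org.elasticsearch.spark.sql")
  // Every level of the nested array path must be listed explicitly,
  // otherwise the connector maps intermediate levels as single values
  // and throws EsHadoopIllegalStateException at read time.
  .option("es.read.field.as.array.include",
    "metadata.a.b.c,metadata.a.b.c.d,metadata.a.b.c.d.e")
  .load("my-index")
```

Note that the option must be set on the same reader that performs the `load`; setting it on a different `DataFrameReader` instance or after the load has no effect.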

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.