Insert parent doc and child doc in One Spark job


(Netanel Malka) #1

Hi,
I'm trying to insert a parent doc into one type and a child doc into another type in the same Spark job.
I defined es.mapping.parent in the Spark conf, pointing at the correct field.
But when the job calls saveJsonToEs(<INDEX/TYPE_NAME>), I get this exception:
Caused by: org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: [JsonExtractor for field [parent]] cannot extract value from entity [class java.lang.String]
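Roughly what my setup looks like (a minimal sketch; the index/type names, the parent field my_parent_field, and the sample documents are placeholders, not my real values):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.spark._ // brings saveJsonToEs onto RDDs

// es.mapping.parent set globally in the Spark conf,
// so it applies to every save in this job
val conf = new SparkConf()
  .setAppName("parent-child-job")
  .set("es.nodes", "localhost")
  .set("es.mapping.parent", "my_parent_field")

val sc = new SparkContext(conf)

// parent docs go to one type...
val parents = sc.makeRDD(Seq("""{"id": "p1", "name": "parent-1"}"""))
parents.saveJsonToEs("my_index/parent_type")

// ...child docs to another; each child carries the parent-routing field
val children = sc.makeRDD(Seq("""{"name": "child-1", "my_parent_field": "p1"}"""))
children.saveJsonToEs("my_index/child_type") // exception thrown here
```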

Thanks,


(Netanel Malka) #2

I found the solution: pass the setting to the save call itself,
saveJsonToEs(<INDEX_NAME/TYPE_NAME>, Map("es.mapping.parent" -> <NAME_OF_PARENT_FIELD>)),
instead of defining it in the Spark conf object. That way it only applies to the child save, as shown in the sketch below.
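Spelled out, the fix looks roughly like this (same placeholder names as in the sketch above; only the scoping of es.mapping.parent changes):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.spark._ // brings saveJsonToEs onto RDDs

val sc = new SparkContext(
  new SparkConf().setAppName("parent-child-job").set("es.nodes", "localhost"))

// parent docs saved without any parent mapping
val parents = sc.makeRDD(Seq("""{"id": "p1", "name": "parent-1"}"""))
parents.saveJsonToEs("my_index/parent_type")

// child docs saved with es.mapping.parent scoped to this call only,
// so the parent save above is unaffected
val children = sc.makeRDD(Seq("""{"name": "child-1", "my_parent_field": "p1"}"""))
children.saveJsonToEs("my_index/child_type",
  Map("es.mapping.parent" -> "my_parent_field"))
```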


(Costin Leau) #3

Both configurations should work. What version of ES-Hadoop are you using, and how exactly did you define es.mapping.parent "in the conf object of Spark"?
