I am using elasticsearch-hadoop-hive-7.2.0.jar to expose Elasticsearch indices as external tables in Hive. However, I can't get it to work with a nested column. Here is my table definition:
CREATE EXTERNAL TABLE test2019197 (
  fault_details struct<faults:array<struct<Details:string, Resolution:string>>>
)
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES (
  'es.nodes' = '11111.us-central1.gcp.cloud.es.io:9243',
  'es.mapping.names' = 'fault_details:fault_details',
  'es.net.ssl' = 'true',
  'es.resource' = 'test2019197',
  'es.nodes.wan.only' = 'true',
  'es.index.read.missing.as.empty' = 'false'
);
For this particular column, my Elasticsearch data looks like:
"fault_details" : {
"faults" : [
{
"Resolution" : "Bright",
"Details" : "Reject"
},
{
"Resolution" : "Reduce Brightness",
"Details" : "Reject"
}
]
},
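The table itself gets created without errors; the exception only shows up when I read the column. For reference, reads along these lines trigger it (illustrative queries, not my exact ones):

-- Reading into the nested structs fails:
SELECT fault_details.faults[0].Details,
       fault_details.faults[0].Resolution
FROM test2019197;

-- A plain scan of the column fails the same way:
SELECT fault_details FROM test2019197 LIMIT 10;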
I tried multiple 'es.mapping.names' variants, and also dropping the struct and declaring the field as a plain string column (sketched after the error below), but every attempt ended in a ClassCastException, which I'd expect when the datatypes on the two sides don't match. Can someone take a look at the table above, though? This time the Hive datatype mirrors the Elasticsearch document, yet I still get:
Failed with exception java.io.IOException:java.lang.ClassCastException: org.elasticsearch.hadoop.mr.WritableArrayWritable cannot be cast to org.apache.hadoop.io.MapWritable
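For completeness, this is roughly the string-column variant I tried (reconstructed from memory; the _str suffix on the table name is just for illustration):

CREATE EXTERNAL TABLE test2019197_str (fault_details string)
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES (
  'es.nodes' = '11111.us-central1.gcp.cloud.es.io:9243',
  'es.net.ssl' = 'true',
  'es.resource' = 'test2019197',
  'es.nodes.wan.only' = 'true',
  'es.index.read.missing.as.empty' = 'false'
);

That variant also hit a ClassCastException when reading the column.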
TIA