I am getting the following error when I execute a select statement:
org.elasticsearch.hadoop.mr.WritableArrayWritable cannot be cast to org.apache.hadoop.io.MapWritable
I tried the exclude option as well, but I get the same exception. If I do not include the child columns, I am able to see the data without any issue.
Please assist me with this issue. Thanks in advance.
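For what it's worth, this cast error often shows up when es-hadoop reads an array-of-objects field without being told it is an array. A minimal sketch of a Hive table declaring such a field (the table, index, field, and child column names below are assumptions for illustration, not taken from your setup):

```sql
-- Sketch only: names are placeholders.
create external table my_es_table(
  -- an array-of-objects field must be declared as ARRAY<STRUCT<...>>
  rootcol5 array<struct<childcol1:string, childcol2:string>>
)
stored by "org.elasticsearch.hadoop.hive.EsStorageHandler"
tblproperties(
  "es.resource" = "my_index/my_type",
  "es.nodes"    = "localhost",
  -- tell the connector this field holds an array, not a single object
  "es.read.field.as.array.include" = "rootcol5"
);
```

Without the es.read.field.as.array.include entry the connector may hand Hive a WritableArrayWritable where a MapWritable is expected, which matches the exception above.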
I am new to this as well, but I created an external table like this:
create external table vgsale_es(
rank int,
....
)
stored by "org.elasticsearch.hadoop.hive.EsStorageHandler"
tblproperties("es.resource"="vgsale_es/vgsale",
"es.index.auto.create"="true",
"es.nodes"="localhost");
Thanks, Varun.
But I am using ES-Spark to index the documents and creating an external table to retrieve the data. I am indexing a nested JSON document (please see the document in my question), not a CSV file.
Hi @Srinivas2, can you share the result of the index mapping endpoint for the index on Elasticsearch you are trying to query? I'm thinking there might be a misalignment of your es.read.field.as.array.exclude setting compared to the index mapping you're trying to read.
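For anyone following along, the mapping can be fetched from the mapping endpoint; the host and index name here are assumed from the DDL earlier in the thread:

```shell
# Fetch the index mapping (host and index name assumed, adjust as needed).
curl -s "http://localhost:9200/vgsale_es/_mapping?pretty"
```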
@james.baiera Thanks James.
Below is the mapping (I have updated the document as well). I have also manipulated the document (removed some of the fields). However, the original document looks the same as what is shown here. I need to extract the parent documents along with the child columns, so I used dot notation in es.mapping.names.
@james.baiera: Hi James, did you get a chance to look into the issue? I created an index with a mapping that declares rootcol5 as a nested type, but I am getting the same error. I also tried es.read.field.as.array.include, but no luck. I followed the official docs for this, as rootcol5 contains an array of objects.
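For comparison, a mapping that declares rootcol5 as nested would look roughly like this. The child field names and types are placeholders, since the real mapping is not shown here, and older Elasticsearch versions with mapping types would nest this one level deeper under the type name:

```json
{
  "mappings": {
    "properties": {
      "rootcol5": {
        "type": "nested",
        "properties": {
          "childcol1": { "type": "keyword" },
          "childcol2": { "type": "keyword" }
        }
      }
    }
  }
}
```

As far as I understand, even with a nested mapping the Hive table still needs "es.read.field.as.array.include" = "rootcol5" in its TBLPROPERTIES, because the Elasticsearch mapping itself does not record whether a field holds one object or an array of them.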