Failed with exception java.io.IOException:java.lang.ClassCastException: org.elasticsearch.hadoop.mr.WritableArrayWritable cannot be cast to org.apache.hadoop.io.Text

Continuing the discussion from ElasticSearch 2.1.2 JAR for Hadoop Integration is not working:

CREATE EXTERNAL TABLE TestTable(key STRING, msgtype STRING, app_log_ts STRING)
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES('es.resource' = 'logstash-*/log', 'es.nodes' = 'es.host', 'es.port' = '9200',
  'es.mapping.names' = 'key:CLIENT_ORDER_ID,msgtype:MESSAGE_TYPE,app_log_ts:APP_TIME');

While executing a select I see this exception:
select * from TestTable;
Failed with exception java.io.IOException:java.lang.ClassCastException: org.elasticsearch.hadoop.mr.WritableArrayWritable cannot be cast to org.apache.hadoop.io.Text

NOTE: It prints the first few rows, then throws the exception and exits.
I see that ES has type text for all the fields.

jar: elasticsearch-hadoop-2.2.0-m1.jar

  1. Use the latest GA jar (currently 2.2.0), not a milestone.
  2. There's a type mismatch in your mapping: a field is mapped as an array in Elasticsearch while your Hive mapping expects a single-valued String. More about it here.
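For reference, such an array/scalar mismatch can typically be handled with the `es.read.field.as.array.exclude` (or `.include`) property in TBLPROPERTIES, which tells elasticsearch-hadoop not to treat the named field as an array on read. A sketch based on the table above — the excluded field name here is a placeholder, not taken from the original post:

```sql
CREATE EXTERNAL TABLE TestTable(key STRING, msgtype STRING, app_log_ts STRING)
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES(
  'es.resource' = 'logstash-*/log',
  'es.nodes'    = 'es.host',
  'es.port'     = '9200',
  'es.mapping.names' = 'key:CLIENT_ORDER_ID,msgtype:MESSAGE_TYPE,app_log_ts:APP_TIME',
  -- hypothetical: replace CLIENT_ORDER_ID with whichever ES field actually comes back as an array
  'es.read.field.as.array.exclude' = 'CLIENT_ORDER_ID'
);
```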

Hi Costin,
The attribute "es.read.field.as.array.exclude" helped resolve it.
Thanks for your quick response!