Hive integration with Elasticsearch 7.2

Dear All,
After several rounds of attempts I have gotten about halfway through integrating Hive (1.2.1000.2.6.0.3-8) with Elasticsearch 7.2, but I am not yet successful. I am using HDP 2.6.5 and need your help.
I have a 3-node Elasticsearch cluster and a single-node HDP installation, all on separate machines. If any corrections are required in the config settings, or if you can point out where I am going wrong, please advise.

------------ Below is ES_host_1-------------------
cluster.name: my-application
node.name: esnode1.test.com
path.data: /data1
path.logs: /var/log/elasticsearch
network.host: 192.168.1.5
http.port: 9200
discovery.seed_hosts: ["esnode1.test.com", "esnode2.test.com", "esnode3.test.com"]
cluster.initial_master_nodes: ["esnode3.test.com", "esnode2.test.com"]

------------ Below is ES_host_2 & kibana -------------------
cluster.name: my-application
node.name: esnode2.test.com
path.data: /data1
path.logs: /var/log/elasticsearch
network.host: 192.168.1.6
http.port: 9200
discovery.seed_hosts: ["esnode2.test.com", "esnode3.test.com", "esnode1.test.com"]
cluster.initial_master_nodes: ["esnode1.test.com", "esnode2.test.com", "esnode3.test.com"]
xpack.monitoring.collection.enabled: true

------------ Below is ES_host_3-------------------
cluster.name: my-application
node.name: esnode3.test.com
path.data: /data1
path.logs: /var/log/elasticsearch
network.host: 192.168.1.7
http.port: 9200
discovery.seed_hosts: ["esnode2.test.com", "esnode3.test.com", "esnode1.test.com"]
cluster.initial_master_nodes: ["esnode2.test.com", "esnode1.test.com"]
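One thing worth checking in the configs above: `cluster.initial_master_nodes` lists a different set of nodes on each of the three hosts. The Elasticsearch documentation recommends that this setting name the same master-eligible nodes on every node for the initial bootstrap, and that it be removed once the cluster has formed. A consistent version might look like this (a sketch, using the hostnames from above):

```yaml
# Identical on all three nodes for the initial bootstrap only;
# remove this setting after the cluster has formed.
cluster.initial_master_nodes: ["esnode1.test.com", "esnode2.test.com", "esnode3.test.com"]
```

This only matters the first time the cluster starts, but a mismatched list can lead to a split bootstrap, so it is worth ruling out.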

------------------ Elasticsearch Hadoop Jar file location ----------------
 
Copied the jar to /usr/hdp/2.6.0.3-8/hive/lib/elasticsearch-hadoop-7.2.0.jar and set:
    <property>
      <name>HIVE_AUX_JARS_PATH</name>
      <value>/usr/hdp/current/hive-server2/lib/elasticsearch-hadoop-7.2.0.jar</value>
    </property>
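If HIVE_AUX_JARS_PATH alone does not get the jar onto the task classpath, one alternative worth trying (a sketch, reusing the jar path from above) is to set `hive.aux.jars.path` in hive-site.xml, which Hive uses to ship auxiliary jars to the MapReduce job itself:

```xml
<!-- hive-site.xml: jars listed here are shipped with each MapReduce job,
     so the class is available inside the YARN containers, not just on
     the HiveServer2/CLI classpath. Note the file:// scheme. -->
<property>
  <name>hive.aux.jars.path</name>
  <value>file:///usr/hdp/current/hive-server2/lib/elasticsearch-hadoop-7.2.0.jar</value>
</property>
```

HiveServer2 would need a restart after this change for it to take effect.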
[root@hadoop-node1]# ll -d /usr/hdp/current/hive-server2
lrwxrwxrwx 1 root root 23 Jul 23 17:34 /usr/hdp/current/hive-server2 -> /usr/hdp/2.6.0.3-8/hive

I am able to write data to the external table in Hive, but when I try to load the data into ES, I get the error below:

Logging initialized using configuration in file:/etc/hive/2.6.0.3-8/0/hive-log4j.properties
OK
Time taken: 1.118 seconds
Query ID = hive_20190725220921_fd962234-e1a2-41af-89f3-b7e4cc27e86d
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1563949354091_0016, Tracking URL = http://hadoop-node.test.com:8088/proxy/application_1563949354091_0016/
Kill Command = /usr/hdp/2.6.0.3-8/hadoop/bin/hadoop job  -kill job_1563949354091_0016
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2019-07-25 22:09:28,184 Stage-1 map = 0%,  reduce = 0%
2019-07-25 22:09:41,582 Stage-1 map = 100%,  reduce = 0%
Ended Job = job_1563949354091_0016 with errors
Error during job, obtaining debugging information...
Examining task ID: task_1563949354091_0016_m_000000 (and more) from job job_1563949354091_0016

Task with the most failures(4):
-----
Task ID:
  task_1563949354091_0016_m_000000

URL:
  http://hadoop-node.test.com:8088/taskdetails.jsp?jobid=job_1563949354091_0016&tipid=task_1563949354091_0016_m_000000
-----
Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: Failed to load plan: hdfs://hadoop-node.test.com:8020/tmp/hive/hive/d6d4079b-4813-4e2f-97be-5ea3acea7efa/hive_2019-07-25_22-09-21_665_114610607148402302-1/-mr-10002/268762c6-18a3-467a-936e-7cab06dd1f1c/map.xml: org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find class: org.elasticsearch.hadoop.hive.EsHiveInputFormat
Serialization trace:
inputFileFormatClass (org.apache.hadoop.hive.ql.plan.TableDesc)
tableInfo (org.apache.hadoop.hive.ql.plan.FileSinkDesc)
conf (org.apache.hadoop.hive.ql.exec.FileSinkOperator)
childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)
        at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:479)
        at org.apache.hadoop.hive.ql.exec.Utilities.getMapWork(Utilities.java:318)
        at org.apache.hadoop.hive.ql.io.HiveInputFormat.init(HiveInputFormat.java:269)
        at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:508)
        at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:483)
        at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:715)
        at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:169)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Caused by: org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find class: org.elasticsearch.hadoop.hive.EsHiveInputFormat

The log continues as below:

Serialization trace:
inputFileFormatClass (org.apache.hadoop.hive.ql.plan.TableDesc)
tableInfo (org.apache.hadoop.hive.ql.plan.FileSinkDesc)
conf (org.apache.hadoop.hive.ql.exec.FileSinkOperator)
childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)
        at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:138)
        at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:115)
        at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:656)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:238)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:226)
        at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObjectOrNull(Kryo.java:745)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:113)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
        at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
        at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
        at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:112)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:18)
        at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
        at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:112)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:18)
        at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
        at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:139)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:17)
        at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
        at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
        at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:672)
        at org.apache.hadoop.hive.ql.exec.Utilities.deserializeObjectByKryo(Utilities.java:1182)
        at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:1069)
        at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:1083)
        at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:439)
        ... 13 more
Caused by: java.lang.ClassNotFoundException: org.elasticsearch.hadoop.hive.EsHiveInputFormat
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:348)
        at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:136)
        ... 49 more

Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143


FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched:
Stage-Stage-1: Map: 1   HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec
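The `ClassNotFoundException` for `org.elasticsearch.hadoop.hive.EsHiveInputFormat` is raised inside the map task, which suggests the connector jar is visible to the Hive CLI/HiveServer2 (the table DDL parses fine) but is not being shipped to the YARN containers that run the job. A quick session-level check (a sketch, using the jar path from the post above) is to register the jar explicitly before running the insert:

```sql
-- ADD JAR distributes the connector to the YARN containers
-- for jobs launched in this session.
ADD JAR /usr/hdp/2.6.0.3-8/hive/lib/elasticsearch-hadoop-7.2.0.jar;

-- Confirm it is registered in the session.
LIST JARS;
```

If the insert succeeds after `ADD JAR`, that confirms the problem is jar distribution rather than the DDL, and a permanent fix in the aux-jars configuration is what is needed.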

Hi,
Can you upload the Hive table create query?

Hi Pradige,
Thanks for responding. I am executing the following as the hive user:

use test1;
drop table if exists test1_es;
create external table test1_es(
  empid int,
  empname string,
  city string,
  mobile_number int)
stored by 'org.elasticsearch.hadoop.hive.EsStorageHandler'
location '/elk/test1_es'   -- this is an HDFS dir
tblproperties('es.resource'='test1/data',
              'es.nodes'='http://esnode1.test.com',
              'es.port'='9200',
              'es.index.auto.create'='true',
              'es.nodes.wan.only'='true');
use test1;
insert overwrite table test1_es
select
v.EmpID,
v.EmpName,
v.City,
v.Mobile_Number
from test1 v;
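One more thing worth noting about the DDL: Elasticsearch 7.x deprecates mapping types, so with the 7.2.0 connector the `es.resource` value can be just the index name rather than the `index/type` form used above (an assumption to verify against the es-hadoop documentation for your exact version):

```sql
-- Index name only; no '/data' type suffix against ES 7.x indices.
'es.resource' = 'test1'
```

This should not cause the ClassNotFoundException above, but it may trip you up once the classpath issue is resolved.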