Hi,
I have been trying to create a Hive table backed by an Elasticsearch index and have run into an issue. The procedure I followed is below.
The employees table from the MySQL Employees sample database was imported into Hadoop via Sqoop, into /user/employees.
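For reference, the import was done with a command along these lines (the connection URL, username, and password below are placeholders, not my actual values):

```shell
# Sketch of the Sqoop import step (placeholder connection details).
sqoop import \
  --connect jdbc:mysql://localhost:3306/employees \
  --username <user> --password <password> \
  --table employees \
  --target-dir /user/employees \
  --fields-terminated-by ','
```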
CREATE TABLE IF NOT EXISTS employees_hive (emp_no int, birth_date date, first_name string, last_name string, gender string, hire_date date)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE;
LOAD DATA INPATH '/user/employees' INTO TABLE employees_hive;
I checked the employees_hive table:
select * from employees_hive limit 3;
OK
10001 1953-09-02 Georgi Facello M 1986-06-26
10002 1964-06-02 Bezalel Simmel F 1985-11-21
10003 1959-12-03 Parto Bamford M 1986-08-28
Now I am trying to create an external table backed by an Elasticsearch index on gender:
CREATE EXTERNAL TABLE IF NOT EXISTS employees_es (emp_no int, birth_date date, first_name string, last_name string, gender string, hire_date date)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES('es.resource' = 'gender/emp_gender') ;
Hive version: 3.1.0
Elasticsearch version: 6.4.2
The error is as follows:
hive> insert overwrite table employees_es select * from employees_hive;
Query ID = hduser_20181110002057_105d67af-12c5-4397-9d7b-04d478e5055d
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1541783971182_0005, Tracking URL = http://praga-VPCEH38FN:8088/proxy/application_1541783971182_0005/
Kill Command = /usr/local/hadoop/bin/mapred job -kill job_1541783971182_0005
Hadoop job information for Stage-2: number of mappers: 1; number of reducers: 0
2018-11-10 00:21:13,601 Stage-2 map = 0%, reduce = 0%
2018-11-10 00:21:42,595 Stage-2 map = 100%, reduce = 0%
Ended Job = job_1541783971182_0005 with errors
Error during job, obtaining debugging information...
Examining task ID: task_1541783971182_0005_m_000000 (and more) from job job_1541783971182_0005
Task with the most failures(4):
Task ID:
task_1541783971182_0005_m_000000
URL:
http://0.0.0.0:8088/taskdetails.jsp?jobid=job_1541783971182_0005&tipid=task_1541783971182_0005_m_000000
Diagnostic Messages for this Task:
Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:163)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.hive.ql.exec.mr.ExecMapRunner.run(ExecMapRunner.java:37)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:465)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:349)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
at org.apache.hadoop.hive.ql.exec.vector.VectorMapOperator.process(VectorMapOperator.java:973)
at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:154)
... 9 more
Caused by: org.elasticsearch.hadoop.serialization.EsHadoopSerializationException: Cannot handle type [class org.apache.hadoop.hive.serde2.io.DateWritableV2] within type [class org.elasticsearch.hadoop.hive.HiveType], instance [1953-09-02] within instance [HiveType{object=[Ljava.lang.Object;@3b36e000, inspector=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector<org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableIntObjectInspector@614aeccc,org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableDateObjectInspector@5116ac09,org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableStringObjectInspector@1bc425e7,org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableStringObjectInspector@1bc425e7,org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableStringObjectInspector@1bc425e7,org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableDateObjectInspector@5116ac09>}] using writer [org.elasticsearch.hadoop.hive.HiveValueWriter@333cb916]
at org.elasticsearch.hadoop.serialization.builder.ContentBuilder.value(ContentBuilder.java:63)
at org.elasticsearch.hadoop.serialization.bulk.TemplatedBulk.doWriteObject(TemplatedBulk.java:71)
at org.elasticsearch.hadoop.serialization.bulk.TemplatedBulk.write(TemplatedBulk.java:58)
at org.elasticsearch.hadoop.hive.EsSerDe.serialize(EsSerDe.java:165)
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:951)
at org.apache.hadoop.hive.ql.exec.vector.VectorFileSinkOperator.process(VectorFileSinkOperator.java:111)
at org.apache.hadoop.hive.ql.exec.Operator.vectorForward(Operator.java:966)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:939)
at org.apache.hadoop.hive.ql.exec.vector.VectorSelectOperator.process(VectorSelectOperator.java:158)
at org.apache.hadoop.hive.ql.exec.Operator.vectorForward(Operator.java:966)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:939)
at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:125)
at org.apache.hadoop.hive.ql.exec.vector.VectorMapOperator.process(VectorMapOperator.java:889)
... 10 more
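From the trace, the failure appears to happen when es-hadoop tries to serialize the two date columns (birth_date and hire_date): it does not know how to handle Hive 3's DateWritableV2 type. As a possible workaround (untested on my side, and the table name employees_es_str is just a name I made up for this sketch), I was considering exposing those columns as string in the ES-backed table and casting on insert, roughly:

```sql
-- Sketch of a possible workaround (untested): declare the date columns
-- as string so es-hadoop never sees DateWritableV2.
CREATE EXTERNAL TABLE IF NOT EXISTS employees_es_str (
  emp_no int, birth_date string, first_name string,
  last_name string, gender string, hire_date string)
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES('es.resource' = 'gender/emp_gender');

-- Cast the date columns to string on the way in.
INSERT OVERWRITE TABLE employees_es_str
SELECT emp_no, CAST(birth_date AS STRING), first_name,
       last_name, gender, CAST(hire_date AS STRING)
FROM employees_hive;
```

I would still prefer a proper fix that keeps the date types, if one exists for this combination of versions.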
Kindly help me.