From Hive to ES: EsHadoopException: Could not write all entries for bulk operation

Hi,

We are trying to load data from Hive into Elasticsearch through the ES-Hadoop integration, but we are facing some issues. The process we followed is:

First, we create the index manually in Elasticsearch with a curl command:

curl -X PUT "ipaddress:9200/indexname" -H 'Content-Type: application/json' -d'
{
  "mappings": {
    "line": {
      "properties": {
        "Name": {"type": "text"},
        "Details": {"type": "text"},
        ...
      }
    }
  }
}
'
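As a sanity check, the resulting mapping can be read back with a GET request (ipaddress and indexname are the same placeholders as above):

curl -X GET "ipaddress:9200/indexname/_mapping?pretty"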
Next, we add the elasticsearch-hadoop-6.2.1.jar to the Hive session, as shown below.
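In the Hive session this is done with a standard ADD JAR statement (the path here is a placeholder for wherever the jar actually lives):

ADD JAR /path/to/elasticsearch-hadoop-6.2.1.jar;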

Then we create an external table in Hive:

create external table tablename (Name string, Details string, ...)
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES (
  'es.resource' = 'indexname/line',
  'es.index.auto.create' = 'false',
  'es.nodes' = 'ipaddress:9200'
);

The table is created successfully.
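One detail we are not sure about (this is an assumption on our side, not something we have confirmed): Hive lowercases column names, while the Elasticsearch mapping above uses capitalized field names like Name and Details. ES-Hadoop has an es.mapping.names table property for mapping Hive columns to Elasticsearch fields; a sketch of what that would look like on our table:

create external table tablename (name string, details string, ...)
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES (
  'es.resource' = 'indexname/line',
  'es.index.auto.create' = 'false',
  'es.nodes' = 'ipaddress:9200',
  -- map the lowercase Hive columns to the capitalized ES field names
  'es.mapping.names' = 'name:Name, details:Details'
);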

Insert data:

insert overwrite table tablename select * from tablename;

Here we get an error:

org.elasticsearch.hadoop.EsHadoopException: Could not write all entries for bulk operation [1000/1000]. Error sample (first [5] error messages):
failed to parse
failed to parse
failed to parse
failed to parse
failed to parse
Bailing out...
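For what it's worth, the "failed to parse" sample suggests Elasticsearch is rejecting the documents against the mapping. One way to check the mapping independently of Hive is to index a single document with curl (the field values below are just placeholders):

curl -X POST "ipaddress:9200/indexname/line" -H 'Content-Type: application/json' -d'
{"Name": "some name", "Details": "some details"}
'

If this succeeds but the Hive insert still fails, the mismatch is presumably in how the Hive rows are serialized (field names or types) rather than in the index itself.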

What is the solution for this? Any advice will be appreciated.
