Table B has ~60 million records.
An INSERT from Table B into the Elasticsearch-backed Hive table A failed with the error below.

Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {

Caused by: org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException: Connection error (check network and/or proxy settings)- all nodes failed;
at org.elasticsearch.hadoop.rest.NetworkClient.execute(NetworkClient.java:149)
at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:466)
at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:450)
at org.elasticsearch.hadoop.rest.RestClient.bulk(RestClient.java:186)
at org.elasticsearch.hadoop.rest.RestRepository.tryFlush(RestRepository.java:248)
at org.elasticsearch.hadoop.rest.RestRepository.flush(RestRepository.java:270)
at org.elasticsearch.hadoop.rest.RestRepository.doWriteToIndex(RestRepository.java:224)
at org.elasticsearch.hadoop.rest.RestRepository.writeProcessedToIndex(RestRepository.java:202)
at org.elasticsearch.hadoop.hive.EsHiveOutputFormat$EsHiveRecordWriter.write(EsHiveOutputFormat.java:63)
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:763)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:841)
at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:88)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:841)
at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:133)
at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:170)
at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:555)

Elasticsearch version:
6.1.2
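
For context, the setup was roughly of the following shape. This is only a sketch: the table names, columns, index name, and node address are illustrative placeholders, not taken from the thread; the post itself only describes an es-hadoop-backed Hive table and a bulk INSERT from the large Hive table.

-- Illustrative sketch only; names and values are placeholders.
ADD JAR /path/to/elasticsearch-hadoop-6.1.2.jar;

-- Elasticsearch-backed Hive table (table A).
CREATE EXTERNAL TABLE table_a (
  id      STRING,
  payload STRING
)
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES (
  'es.resource' = 'my_index/my_type',   -- target index/type (placeholder)
  'es.nodes'    = 'es-node-1:9200'      -- Elasticsearch host (placeholder)
);

-- The failing statement: pushing ~60 million rows from the plain Hive table (table B).
INSERT INTO TABLE table_a
SELECT id, payload
FROM table_b;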

The error should be self-describing here: Connection error (check network and/or proxy settings)- all nodes failed; means that you had a connection error to Elasticsearch and had no more nodes to fall back to. If you increase the logging level, there should be more information about why the connection failed.
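
One thing that is often worth checking at that point, given the "no more nodes to fall back to" part: if es.nodes lists only a single host, the connector has nothing to fall back to when that one connection drops. A rough sketch of the relevant table properties (the host names and the WAN setting are assumptions for illustration, not taken from this thread):

-- Illustrative only: list more than one node so the connector has a fallback,
-- and consider WAN mode if Hive cannot reach the data nodes directly.
ALTER TABLE table_a SET TBLPROPERTIES (
  'es.nodes'          = 'es-node-1:9200,es-node-2:9200,es-node-3:9200',
  'es.nodes.wan.only' = 'true'
);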

Yes, that is true.
What would be an efficient way to insert new rows from the Hive table into the Elasticsearch table?
The Hive table is getting bigger and bigger, and the insert into the Elasticsearch table keeps erroring out with the exception above.

The first course of action that I would advise is getting to the root of why the connector is failing. If it's a pretty straightforward problem to fix, then you've got a good path forward. If it's more complex, then it might be worth changing your usage patterns (see the sketch below). I would suggest running the job with logging bumped up to DEBUG to see what falls out in the logs.
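
Not from this thread, but as a sketch of what changing the usage pattern could look like: tune the bulk writes down so a struggling cluster gets retried rather than exhausted, and load only the rows added since the last run instead of re-pushing all ~60 million each time. The property values, the load_date watermark column, and the table names below are assumptions for illustration.

-- Illustrative only: smaller, retried bulk requests.
ALTER TABLE table_a SET TBLPROPERTIES (
  'es.batch.size.entries'      = '500',   -- fewer documents per bulk request
  'es.batch.size.bytes'        = '1mb',   -- cap the bulk payload size
  'es.batch.write.retry.count' = '10',    -- retry bulk writes that get rejected
  'es.batch.write.retry.wait'  = '60s'    -- back off between retries
);

-- Illustrative only: push just the newly arrived slice of Table B,
-- keyed off a partition or watermark column (load_date is assumed here).
INSERT INTO TABLE table_a
SELECT id, payload
FROM table_b
WHERE load_date > '2018-01-31';

That way each run writes a bounded slice of Table B, so a single flaky bulk flush does not take down a 60-million-row job.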
