java.net.ConnectException when trying to overwrite a table in Hive?

I already have a table (test) in Hive, with a single column (logtype) containing around 5 rows.

And I created another table (hive_to_elastic) with the Elasticsearch storage handler properties, as follows:

CREATE EXTERNAL TABLE hive_to_elastic(logtype string)
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES('es.resource'='my_index/my_type','es.nodes'='10.2.2.47','es.port'='9200');

Now I'm simply trying to copy the data from test to hive_to_elastic by executing the below query:

INSERT OVERWRITE TABLE hive_to_elastic SELECT * FROM test;

But what I can see is this:

[screenshot: Hive console output after submitting the query]

And it freezes after that point. I can't see the doc count increasing in my ES index. A long time after starting the query, I get this exception:

[screenshot: java.net.ConnectException stack trace]

Where am I going wrong? Any help would be appreciated.

It seems that Hive cannot reach Hadoop to launch the job. The stack trace suggests that the connection is being refused at the Hadoop RPC layer.
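One way to narrow this down is to check whether the Hadoop RPC endpoints are reachable at all from the machine running Hive. A minimal sketch, assuming the default ports (8032 for the YARN ResourceManager, 8020 for the HDFS NameNode — substitute whatever your cluster configs actually use; 9200 is taken from your TBLPROPERTIES):

```shell
#!/usr/bin/env bash
# Probe TCP reachability of the endpoints Hive needs.
# Ports 8032/8020 are assumed defaults; verify against your cluster configs.

check_port() {
  local host=$1 port=$2
  # Bash's /dev/tcp pseudo-device attempts a TCP connect; cap it at 3s.
  if timeout 3 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "${host}:${port} open"
  else
    echo "${host}:${port} refused/unreachable"
  fi
}

check_port 10.2.2.47 8032   # YARN ResourceManager (assumed default port)
check_port 10.2.2.47 8020   # HDFS NameNode (assumed default port)
check_port 10.2.2.47 9200   # Elasticsearch HTTP, from the table's TBLPROPERTIES
```

"refused" on the ResourceManager or NameNode port would match the stack trace and point at a wrong address/port in the configs or a firewall rule, rather than at elasticsearch-hadoop itself.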

@james.baiera Thank you so much for the reply.

[quote="james.baiera, post:2, topic:90952"]
It seems that hive cannot reach hadoop to launch the job
[/quote]

What steps can I take to resolve this? Could it be due to network restrictions?

Do I have to add anything within the hadoop or hive configurations? I could provide the configs if needed.
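For reference, these are the configuration properties that typically control which endpoint Hive's job client connects to. This is a hedged sketch, not your actual config — "rm-host" is a placeholder, and the property values must match what your cluster actually runs:

```xml
<!-- mapred-site.xml: submit Hive jobs to YARN -->
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>

<!-- yarn-site.xml: the RPC address clients use to submit jobs;
     "rm-host" is a placeholder for the ResourceManager hostname -->
<property>
  <name>yarn.resourcemanager.address</name>
  <value>rm-host:8032</value>
</property>
```

If these point at the wrong host or port (or are unset and fall back to defaults that don't match the cluster), the job submission is refused exactly at the Hadoop RPC layer.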

Thanks.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.