We simply insert into an external table from an existing one, and the query hangs right after printing the kill command:
hive> INSERT OVERWRITE TABLE user01 SELECT id,name FROM user_source;
Query ID = csi_20160323182829_c33296cf-9bd6-4809-8d0e-acb521dd1fcc
Total jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1458728756355_0001, Tracking URL = http://server1:8088/proxy/application_1458728756355_0001/
Kill Command = /home/csi/hadoop/bin/hadoop job -kill job_1458728756355_0001
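In case it helps with diagnosis, the job's state can be queried with the standard YARN and MapReduce CLIs (the paths below assume our `/home/csi/hadoop` install; the IDs are the ones from the output above):

```shell
# Query the YARN application state (ACCEPTED / RUNNING / FAILED, etc.)
/home/csi/hadoop/bin/yarn application -status application_1458728756355_0001

# Query the MapReduce job progress and counters
/home/csi/hadoop/bin/mapred job -status job_1458728756355_0001

# The kill command Hive printed, if the job has to be aborted manually
/home/csi/hadoop/bin/hadoop job -kill job_1458728756355_0001
```

(A job stuck in ACCEPTED usually points to the cluster having no free containers, while a job stuck in RUNNING at 0% points to a task-level problem.)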
Some errors from the Hive logs are as follows:
2016-03-24 10:31:17,614 INFO [main]: exec.Utilities (Utilities.java:getBaseWork(456)) - File not found: File does not exist: /tmp/hive/csi/9aeb0c93-0e42-4907-80bd-6c46c773e2cc/hive_2016-03-24_10-31-14_052_2790586764771909959-1/-mr-10002/476d8ea7-00d0-4248-8995-dcbbd9f773e4/reduce.xml
    at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:71)
    at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1828)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1799)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1712)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:587)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:365)