Select * works but select count(*) fails. Please help

works fine: select * from hivetbl_betateamcity_webhook_payloads
fails: select count(1) from hivetbl_betateamcity_webhook_payloads
fails: select count(*) from hivetbl_betateamcity_webhook_payloads

External table definition:
CREATE EXTERNAL TABLE HiveTbl_betateamcity_webhook_payloads (
  datetimestamp timestamp,
  version string,
  Buildlog_url string,
  Teamcity_buildid string,
  Teamcity_url string,
  build_status string,
  headers_request_uri string,
  message string)
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES (
  'es.nodes' = '10.212.46.2',
  'es.index.auto.create' = 'false',
  'es.resource' = 'betateamcity_webhook_payloads',
  'es.query' = '?q=*',
  'es.mapping.names' = 'datetimestamp:@timestamp , version:@version, Buildlog_url:Buildlog_url, Teamcity_buildid:Teamcity_buildid, Teamcity_url:Teamcity_url, build_status:build.status, headers_request_uri: headers.request_uri, message:message '
);
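For reference, the Elasticsearch-side field names that es.mapping.names points at can be compared directly against the index mapping. A minimal check, assuming the ES node answers HTTP on the default port 9200:

curl -s 'http://10.212.46.2:9200/betateamcity_webhook_payloads/_mapping?pretty'

Any Hive column mapped to a field that does not appear in that mapping would be a likely suspect.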

Hive error:
17/03/22 16:38:38 ERROR status.SparkJobMonitor: Status: Failed
17/03/22 16:38:38 INFO log.PerfLogger: </PERFLOG method=SparkRunJob start=1490225917349 end=1490225918350 duration=1001 from=org.apache.hadoop.hive.ql.exec.spark.status.SparkJobMonitor>
17/03/22 16:38:38 ERROR ql.Driver: FAILED: Execution Error, return code 3 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
17/03/22 16:38:38 INFO log.PerfLogger: </PERFLOG method=Driver.execute start=1490225917275 end=1490225918354 duration=1079 from=org.apache.hadoop.hive.ql.Driver>
17/03/22 16:38:38 INFO ql.Driver: Completed executing command(queryId=hive_20170322163838_05bab898-7bf3-4523-83ef-87376c67af0e); Time taken: 1.079 seconds
17/03/22 16:38:38 INFO log.PerfLogger:
17/03/22 16:38:38 INFO log.PerfLogger: </PERFLOG method=releaseLocks start=1490225918354 end=1490225918376 duration=22 from=org.apache.hadoop.hive.ql.Driver>
17/03/22 16:38:38 ERROR operation.Operation: Error running hive query:
org.apache.hive.service.cli.HiveSQLException: Error while processing statement: FAILED: Execution Error, return code 3 from org.apache.hadoop.hive.ql.exec.spark.SparkTask
at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:374)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:180)
at org.apache.hive.service.cli.operation.SQLOperation.access$100(SQLOperation.java:72)
at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:232)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:245)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

Note: here hive.execution.engine=spark, but we are getting this error with mr as well.
Note: Logstash is writing to the same IP address, not localhost.

Please help, it's kind of urgent!

Could you include the logs from the failed tasks? Generally Hive does not send back the actual error that occurred on the workers.

Thanks for your response, James. We are using Cloudera (CDH). Where in CDH can I find the actual Hive task logs? Please advise!

@dixitnikhil2004 it depends on which UI you are using, but when Hive stands up a job, it either executes the job locally (just running over an open stream of data, for something like a simple select) or on the execution nodes (for things like joins and groupings). Depending on which framework you are executing Hive against, I would look for where the executors' logs live.
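For example, on CDH the MapReduce and Spark executors usually run under YARN, so (assuming that setup) the aggregated task logs can typically be pulled with:

yarn logs -applicationId <application_id>

where <application_id> is the YARN application ID shown in the ResourceManager UI or printed in the Hive console output when the job is submitted. The actual error thrown on the workers by the EsStorageHandler task should appear there rather than in the HiveServer2 trace above.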

Hi James,

I ran a query against the ES-backed external Hive table from the Hive command line, and here's the error:

[builder@monkey ~]$ hive -e 'Select * from hivetbl_betateamcity_webhook_payloads where build_status = "success"'
2017-03-27 16:36:15,604 WARN [main] mapreduce.TableMapReduceUtil: The hbase-prefix-tree module jar containing PrefixTreeCodec is not present. Continuing without it.

Logging initialized using configuration in jar:file:/opt/cloudera/parcels/CDH-5.7.0-1.cdh5.7.0.p0.45/jars/hive-common-1.1.0-cdh5.7.0.jar!/hive-log4j.properties
FAILED: ParseException line 1:81 character '' not supported here
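As a side note on this last error (separate from the original count(*) failure): a ParseException about an unsupported character usually means a non-ASCII character made it into the statement, and the quotes around success in the pasted command look like smart quotes. Retyping the predicate with plain ASCII quotes should at least get past the parser, for example:

hive -e 'select * from hivetbl_betateamcity_webhook_payloads where build_status = "success"'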
