Hello, when I run
hadoop jar elasticsearch-yarn-2.1.1.jar -start hdfs.upload.dir=private/wcw/elasticsearchrtf
to start Elasticsearch on CDH 5.0.0, I run into the following problem:
Application application_1440481474920_527812 failed 2 times due to AM Container for appattempt_1440481474920_527812_000002 exited with exitCode: -1000 due to: java.io.IOException: Resource hdfs://ns1/user/elasticsearch-yarn-2.1.1.jar is not publicly accessable and as such cannot be part of the public cache.
Failing this attempt. Failing the application.
Note: previously I added the yarn user to our supergroup in order to get past an earlier permission problem (described below).
Looks like there's a permission problem with the new jar. Can you double-check the permissions of the mentioned file and, since you mentioned it worked before (or that's what I understood), compare the working version with this one?
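For example, something along these lines would show the permissions on the jar and on its parent directory (paths taken from the error message above); for YARN's public cache, the file has to be readable by everyone and every ancestor directory traversable by everyone:
# permissions of the jar itself
hadoop fs -ls hdfs://ns1/user/elasticsearch-yarn-2.1.1.jar
# permissions of /user (shown as an entry in the root listing)
hadoop fs -ls hdfs://ns1/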
Thank you very much indeed. I want to install Elasticsearch because we have a Hadoop YARN cluster with many data sources in HDFS, and I want to try using Elasticsearch to integrate the different kinds of data (building tables with Hive has many limitations, especially when the information is rich). Right now, though, I am stuck on installing Elasticsearch. Can you help me? Thank you.
Thank you very much. Maybe I can explain it more clearly; my steps were:
1. I used hadoop fs -put to upload the jar files and elasticsearch-rtf (a Chinese integration distribution of Elasticsearch) to HDFS.
2. I ran hadoop jar elasticsearch-yarn-2.1.1.jar -start hdfs.upload.dir=private/elasticsearchrtf to start it.
3. First I hit org.apache.hadoop.security.AccessControlException: Permission denied: user=yarn, access=EXECUTE, inode="/user".......... group: drwxr-x--
4. So I added the yarn user to our supergroup.
5. After that I ran hadoop jar elasticsearch-yarn-2.1.1.jar -start hdfs.upload.dir=private/elasticsearchrtf again, and this time I got the error: Application application_1440481474920_527812 failed 2 times due to AM Container for appattempt_1440481474920_527812_000002 exited with exitCode: -1000 due to: java.io.IOException: Resource hdfs://ns1/user/elasticsearch-yarn-2.1.1.jar is not publicly accessable and as such cannot be part of the public cache.
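(For reference, YARN only places a resource in the public cache when the file itself is world-readable and every directory above it is world-executable. A minimal sketch of one possible fix, assuming opening up these permissions is acceptable on your cluster; the paths come from the error message:)
hadoop fs -chmod 755 /user
hadoop fs -chmod 644 /user/elasticsearch-yarn-2.1.1.jar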
Try to see under what credentials, and on which cluster, the yarn command is executed.
Do note that one can use ES with Hadoop without using YARN; this option (in beta) is offered as a convenience. It is not mandatory and it is not recommended for production.
Considering the issues that you are facing, consider installing ES manually, as you would in any environment outside YARN (puppet, chef, etc. also work).
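For example, a minimal sketch of a manual, single-node install (assuming Elasticsearch 2.1.1 to match the elasticsearch-yarn version, and that the tarball has already been downloaded from the official download page; adjust paths and version as needed):
tar -xzf elasticsearch-2.1.1.tar.gz
cd elasticsearch-2.1.1
# optionally adjust cluster.name and network.host in config/elasticsearch.yml first
bin/elasticsearch -d
# verify the node is up
curl http://localhost:9200/
es-hadoop jobs can then point at that node over HTTP (port 9200 by default) via the es.nodes setting.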
I use Marvel, but Marvel is very slow, not as fluent as the Solr UI. Maybe that is because it is served from the cloud rather than from a local machine. Can you recommend a UI plugin like Marvel that I can do this work in? Thank you.