Hello,
I want to use Hadoop (HDFS) to store Elasticsearch snapshots. I configured Elasticsearch as follows:
1 - Added security.manager.enabled: false in elasticsearch.yml
2 - Configured the repository properties in elasticsearch.yml as:
repositories.hdfs.uri: "hdfs://cloudera-22:8020/"
repositories.hdfs.path: "home/ejmanager/elastic-search-test"
repositories.hdfs.load_defaults: "true"
repositories.hdfs.concurrent_streams: 5
repositories.hdfs.compress: "false"
repositories.hdfs.chunk_size: "10mb"
3 - I installed the plugin with:
bin/plugin install elasticsearch-repository-hdfs-2.4.0.BUILD-20160418.041409-10-hadoop2.zip
4 - Restarted Elasticsearch
5 - Ran the following REST request against Elasticsearch:
PUT _snapshot/my_hdfs_repository
{
  "type": "hdfs",
  "settings": {
    "uri": "hdfs://cloudera-22:8020/",
    "path": "home/ejmanager/elastic-search-test",
    "conf.dfs.client.read.shortcircuit": "true"
  }
}
And I get the following error:
{
  "error": {
    "root_cause": [
      {
        "type": "repository_exception",
        "reason": "[my_hdfs_repository] failed to create repository"
      }
    ],
    "type": "repository_exception",
    "reason": "[my_hdfs_repository] failed to create repository",
    "caused_by": {
      "type": "creation_exception",
      "reason": "Guice creation errors:\n\n1) Error injecting constructor, org.apache.hadoop.security.AccessControlException: Permission denied: user=weblogic, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x\n\tat org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)\n\tat org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)\n\tat org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:216)\n\tat org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.
      "caused_by": {
        "type": "access_control_exception",
        "reason": "Permission denied: user=weblogic, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x\n\tat org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)\n\tat org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)\n\tat org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:216)\n\tat org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:145)
        "caused_by": {
          "type": "remote_exception",
          "reason": "Permission denied: user=weblogic, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x\n\tat org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:257)\n\tat org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:238)\n\tat org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:216)\n\tat org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:145)\n\tat
        }
      }
    }
  },
  "status": 500
}
I can see the error is related to permissions when creating the repository, but I do not understand how I can fix it.
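My guess is that the user the Elasticsearch process runs as (weblogic, according to the error) cannot write on HDFS, and that because my "path" has no leading slash it is being resolved relative to /user, where that user has no write access. Would something like the following, run as the HDFS superuser, be the right way to fix it? (The user name weblogic and the absolute path are just my assumption of what it should be.)

# create the snapshot directory on HDFS and give ownership to the Elasticsearch user
hadoop fs -mkdir -p /home/ejmanager/elastic-search-test
hadoop fs -chown -R weblogic /home/ejmanager/elastic-search-test

Or should I instead change the "path" setting to an absolute path such as /home/ejmanager/elastic-search-test?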
THANKS!!