It works when I set the role to "superuser", but I get the error below when I create a custom role in Kibana and then try to use it in role-mapping.yml:
Config: Error 403 Forbidden: [security_exception] action [indices:data/write/update] is unauthorized for user [XXXXX_User]
Your question provided lots of detail about the things that are working, and none about the things that aren't. It is very hard to give useful advice when the only concrete piece of information we have is that some user with some role isn't able to update some index.
When asking questions, focus your details on the parts that aren't working:
- When I do this, the user gets this error.
- The user has this role.
- The role has this definition.
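A complete problem report covering those three points might look something like this (every name here is hypothetical, just to show the shape):

```
Request:  POST /my_index/doc/1/_update  →  403 [security_exception] indices:data/write/update is unauthorized for user [some_user]
User:     some_user, with roles: [ "my_custom_role" ]
Role:
{
  "my_custom_role": {
    "indices": [
      { "names": [ "my_index" ], "privileges": [ "read", "write" ] }
    ]
  }
}
```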
Here's what you can do to debug the problem:
Use the authenticate API to check your user's roles, and then the roles API to check the definitions of those roles.
That will show you which roles your user has. Check that your custom role is there.
Then run the roles API call, copy-and-pasting the role name from the output of the authenticate API. About 50% of these problems are caused by typos, so copy the role name directly rather than retyping it.
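Concretely, the two calls look like this. On pre-7.0 clusters (which your `_xpack` paths suggest) the security APIs live under `/_xpack/security`; on 7.x+ they moved to `/_security`. The role name is a placeholder:

```
# 1. Check which roles the current user actually has
GET /_xpack/security/_authenticate

# 2. Check the definition of one of those roles
#    (paste the exact role name from step 1)
GET /_xpack/security/role/my_custom_role
```

If step 2 returns `{}`, the cluster has no role by that exact name, which usually means a typo or a case mismatch between the role you created and the name you are looking up.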
I tried GET /_xpack/security/role/kibana?pretty but got {} as output. kibana is the role I created through Management -> Security in the Kibana application. For built-in roles like superuser I get the following output:
Could you please tell me why I am not able to get a response for the role I created? Does it mean we can only assign system-defined roles in the LDAP configuration?
However, if I create the role using Management -> Security, it doesn't work when I configure it under role-mapping.yml. That is, I get the unauthorized for user [XXXXX_User] error.
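For reference, the file-based mapping maps a role name to a list of LDAP DNs, so the name on the left must match the role created in Kibana exactly (case-sensitive). A minimal sketch, with hypothetical DNs:

```yaml
# role_mapping.yml — role name on the left, LDAP user/group DNs on the right
my_custom_role:
  - "cn=XXXXX_User,ou=users,dc=example,dc=com"
  - "cn=es-writers,ou=groups,dc=example,dc=com"
```

Note that roles created through the Kibana UI are stored in the cluster's native security index, not in any file, so the only link between role-mapping.yml and the role is this exact name.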
I am facing a new issue now: if I change the indices name in the role to something else, like my actual index name, authorization fails again and I get the unauthorized for user [XXXXX_User] error.
It works only if I give "*" as the indices name. Can you help me understand this?
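A common cause, sketched with hypothetical names: the `names` patterns in the role's index privileges must match the concrete index actually being written to, including any suffix such as a date, and the write must be covered by a privilege that grants the `indices:data/write/update` action (e.g. `index` or `write`). `"*"` matches every index, which is why it always works:

```
PUT /_xpack/security/role/my_custom_role
{
  "indices": [
    {
      "names": [ "my_index*" ],
      "privileges": [ "read", "index" ]
    }
  ]
}
```

If the application writes through an alias or to time-based indices (e.g. `my_index-2018.11.30`), a bare `"my_index"` pattern will not match them, while the trailing wildcard above will.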