Got "java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.util.StringUtils" in repository-hdfs using Elasticsearch-Yarn

I'm using CDH 5.3.2 and Elasticsearch 2.1.0. I built the zip file of repository-hdfs and installed it as a plugin, then used elasticsearch-yarn to start the Elasticsearch cluster. But when I created an HDFS repository, I got the error below. The Hadoop jar where StringUtils resides (elasticsearch-2.1.0/plugins/repository-hdfs/hadoop-libs/hadoop-common-2.5.0-cdh5.3.2.jar) is included in the Elasticsearch zip. Any ideas?
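For reference, the repository was registered with a request along these lines (the uri and path values here are placeholders, not the real ones):

```
PUT _snapshot/my_backup
{
  "type": "hdfs",
  "settings": {
    "uri": "hdfs://namenode:8020/",
    "path": "elasticsearch/backups"
  }
}
```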

[2015-12-14 15:32:27,288][INFO ][rest.suppressed          ] /_snapshot/my_backup Params: {repository=my_backup}
RemoteTransportException[[Strobe][10.150.12.108:9300][cluster:admin/repository/put]]; nested: RepositoryException[[my_backup] failed to create repository]; nested: NotSerializableExceptionWrapper[Guice creation errors:

1) Error injecting constructor, java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.util.StringUtils
  at org.elasticsearch.repositories.hdfs.HdfsRepository.<init>(Unknown Source)
  while locating org.elasticsearch.repositories.hdfs.HdfsRepository
  while locating org.elasticsearch.repositories.Repository

1 error]; nested: NotSerializableExceptionWrapper[Could not initialize class org.apache.hadoop.util.StringUtils];
Caused by: RepositoryException[[my_backup] failed to create repository]; nested: NotSerializableExceptionWrapper[Guice creation errors:

1) Error injecting constructor, java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.util.StringUtils
  at org.elasticsearch.repositories.hdfs.HdfsRepository.<init>(Unknown Source)
  while locating org.elasticsearch.repositories.hdfs.HdfsRepository
  while locating org.elasticsearch.repositories.Repository

1 error]; nested: NotSerializableExceptionWrapper[Could not initialize class org.apache.hadoop.util.StringUtils];
	at org.elasticsearch.repositories.RepositoriesService.createRepositoryHolder(RepositoriesService.java:411)
	at org.elasticsearch.repositories.RepositoriesService.registerRepository(RepositoriesService.java:368)
	at org.elasticsearch.repositories.RepositoriesService.access$100(RepositoriesService.java:55)
	at org.elasticsearch.repositories.RepositoriesService$1.execute(RepositoriesService.java:110)
	at org.elasticsearch.cluster.service.InternalClusterService$UpdateTask.run(InternalClusterService.java:388)
	at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:231)
	at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:194)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
Caused by: NotSerializableExceptionWrapper[Guice creation errors:

1) Error injecting constructor, java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.util.StringUtils
  at org.elasticsearch.repositories.hdfs.HdfsRepository.<init>(Unknown Source)
  while locating org.elasticsearch.repositories.hdfs.HdfsRepository
  while locating org.elasticsearch.repositories.Repository

1 error]; nested: NotSerializableExceptionWrapper[Could not initialize class org.apache.hadoop.util.StringUtils];
	at org.elasticsearch.common.inject.internal.Errors.throwCreationExceptionIfErrorsExist(Errors.java:360)
	at org.elasticsearch.common.inject.InjectorBuilder.injectDynamically(InjectorBuilder.java:178)
	at org.elasticsearch.common.inject.InjectorBuilder.build(InjectorBuilder.java:110)
	at org.elasticsearch.common.inject.InjectorImpl.createChildInjector(InjectorImpl.java:159)
	at org.elasticsearch.common.inject.ModulesBuilder.createChildInjector(ModulesBuilder.java:55)
	at org.elasticsearch.repositories.RepositoriesService.createRepositoryHolder(RepositoriesService.java:404)
	... 9 more
Caused by: NotSerializableExceptionWrapper[Could not initialize class org.apache.hadoop.util.StringUtils]
	at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:104)
	at org.apache.hadoop.security.Groups.<init>(Groups.java:86)
	at org.apache.hadoop.security.Groups.<init>(Groups.java:66)
	at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:280)
	at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:269)
	at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:297)
	at org.elasticsearch.repositories.hdfs.HdfsRepository.initFileSystem(HdfsRepository.java:122)
	at org.elasticsearch.repositories.hdfs.HdfsRepository.getFileSystem(HdfsRepository.java:99)
	at org.elasticsearch.repositories.hdfs.HdfsRepository.<init>(HdfsRepository.java:71)
	...

Figured out why it failed. I had actually ignored this warning in the log:

[2015-12-15 17:50:24,207][WARN ][plugin.hadoop.hdfs       ] The Java Security Manager is enabled however Hadoop is not compatible with it and thus needs to be disabled; see the docs for more information...

When I first tried to create the repository, it threw an access-denied error (java.io.FilePermission <<ALL FILES>> execute), which made the static initialization of StringUtils fail. When I ran the create-repository request again, I got the NoClassDefFoundError ("Could not initialize class") instead, because the class had already failed to initialize.

{"error":{"root_cause":[{"type":"repository_exception","reason":"[my_backup] failed to create repository"}],"type":"repository_exception","reason":"[my_backup] failed to create repository","caused_by":{"type":"creation_exception","reason":"Guice creation errors:\n\n1) Error injecting constructor, java.lang.ExceptionInInitializerError\n  at org.elasticsearch.repositories.hdfs.HdfsRepository.<init>(Unknown Source)\n  while locating org.elasticsearch.repositories.hdfs.HdfsRepository\n  while locating org.elasticsearch.repositories.Repository\n\n1 error","caused_by":{"type":"exception_in_initializer_error","reason":null,"caused_by":{"type":"access_control_exception","reason":"access denied (\"java.io.FilePermission\" \"<<ALL FILES>>\" \"execute\")"}}}},"status":500}

Hadoop requires a lot of permissions (often without really needing them). Note that ES master has the HDFS plugin merged in without requiring the Security Manager to be disabled; hopefully we'll be able to get this into 2.2 as well.
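For future readers: the repository-hdfs documentation of that era described starting the node with the security manager turned off via a system property. If memory serves it looked like the line below; treat the exact property name as an assumption and check the docs for your plugin version:

```
# Assumed flag from the 2.x-era repository-hdfs docs -- verify against your version
bin/elasticsearch -Des.security.manager.enabled=false
```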

Is the issue resolved now?
It seems not. I am using ES 2.3.1 with the security manager enabled, and exactly the same error messages as above appeared.
Then I added the following line,
permission java.io.FilePermission "*", "read,write,execute";
to the plugin-security.policy file under the repository-hdfs plugin folder, then restarted ES, but got the same problem.
My question is: with version 2.x, can I create an HDFS repository at all while the security manager is enabled? I searched online and found no clear answer.
If not, and I leave the security manager disabled, would that be a potential security hole? Is there any replacement for the security manager in ES 2.x?
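One thing worth checking regarding the plugin-security.policy edit above: Java policy files only honor permission entries that sit inside a grant block, so a bare permission line outside grant { ... } is not valid policy syntax. Also, java.io.FilePermission "*" only matches files directly in the current directory, whereas the permission denied in the error was <<ALL FILES>> "execute". A sketch of what the entry would need to look like (illustrative only; Hadoop may still request further permissions beyond this one):

```
grant {
  // <<ALL FILES>> is the special FilePermission target that matches every path;
  // "*" would only cover files directly in the current directory
  permission java.io.FilePermission "<<ALL FILES>>", "read,write,execute";
};
```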