Issues with using the repository-hdfs plugin for snapshot/restore operations

I am using Elasticsearch 1.2.1 and the CDH 4.6 quick start VM. My ES server is
installed on the same VM.
I have one successful scenario: with the light version of the plugin, I added the
output of the hadoop classpath command to ES_CLASSPATH.
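Concretely, that amounted to something like this before starting ES (a sketch of
my setup; where you export the variable depends on how you launch Elasticsearch):

export ES_CLASSPATH="$ES_CLASSPATH:$(hadoop classpath)"   # append the Hadoop client jars
bin/elasticsearch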

But I encountered errors with both the default version and the hadoop2 version.
Here are the details of the issues.
#1. I installed the plugin with this command:
bin/plugin --install elasticsearch/elasticsearch-repository-hdfs/2.0.0
and I sent this PUT request:
url: http://localhost:9200/_snapshot/hdfs_repo
data:
{
  "type": "hdfs",
  "settings": {
    "uri": "hdfs://localhost.localdomain:8020",
    "path": "/user/cloudera/es_snapshot"
  }
}
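For reference, the same repository registration as a single curl command (same
host, port, and settings as above):

curl -XPUT 'http://localhost:9200/_snapshot/hdfs_repo' -d '{
  "type": "hdfs",
  "settings": {
    "uri": "hdfs://localhost.localdomain:8020",
    "path": "/user/cloudera/es_snapshot"
  }
}'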

I got this response:

{
"error": "RepositoryException[[hdfs_repo] failed to create repository]; nested: CreationException[Guice creation errors:

  1. Error injecting constructor, org.elasticsearch.ElasticsearchGenerationException: Cannot create Hdfs file-system for uri [hdfs://localhost.localdomain:8020]
    at org.elasticsearch.repositories.hdfs.HdfsRepository.(Unknown Source)
    while locating org.elasticsearch.repositories.hdfs.HdfsRepository
    while locating org.elasticsearch.repositories.Repository

1 error]; nested: ElasticsearchGenerationException[Cannot create Hdfs file-system for uri [hdfs://localhost.localdomain:8020]]; nested: RemoteException[Server IPC version 7 cannot communicate with client version 4]; ",
"status": 500
}

I noticed the RemoteException: Server IPC version 7 cannot communicate with
client version 4. As far as I can tell, that means the default plugin bundles a
Hadoop 1.x client, while CDH 4.6 speaks the Hadoop 2.x RPC protocol.
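A quick way to compare the two sides (the plugins/repository-hdfs directory name
is what my install created; yours may differ):

ls plugins/repository-hdfs/    # Hadoop client jars bundled by the installed plugin flavor (run from the ES home directory)
hadoop version                 # Hadoop version provided by the CDH side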

#2. Then I tried the hadoop2 version, so I installed the plugin with this command:
bin/plugin --install elasticsearch/elasticsearch-repository-hdfs/2.0.0-hadoop2

I sent the same PUT request as above, and this time I got an even stranger exception:

NoClassDefFoundError[org/apache/commons/cli/ParseException]
Here is the response:

{
"error": "RepositoryException[[hdfs_repo] failed to create repository]; nested: CreationException[Guice creation errors:

  1. Error injecting constructor, java.lang.NoClassDefFoundError: org/apache/commons/cli/ParseException
    at org.elasticsearch.repositories.hdfs.HdfsRepository.(Unknown Source)
    while locating org.elasticsearch.repositories.hdfs.HdfsRepository
    while locating org.elasticsearch.repositories.Repository

1 error]; nested: NoClassDefFoundError[org/apache/commons/cli/ParseException]; nested: ClassNotFoundException[org.apache.commons.cli.ParseException]; ",
"status": 500
}

I wonder if anyone has had similar experiences. Note that the failed cases are actually
the more realistic deployment choices, because my Hadoop cluster is unlikely to be on
the same node as my ES server.
Thanks,
Jack

Hi Jinyuan Zhou,

I am also having the same issue...

We had the same issue. We're also running CDH 4.6, which expects a
different Hadoop client.

We fixed this by grabbing the source, removing the 'exclude module:
"commons-cli"' line from repository-hdfs/build.gradle, setting 'hadoop2Version
= 2.0.0-cdh4.6.0' in gradle.properties (and setting the esVersion and
luceneVersions for good measure), and building our own zip:
cd repository-hdfs/; ../gradlew -Pdistro=hadoopYarn distZip
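In rough outline the whole sequence looks like this (the GitHub URL is the
elasticsearch-hadoop project, which hosts repository-hdfs, and the
build/distributions output path is Gradle's default; double-check both against
your checkout):

git clone https://github.com/elasticsearch/elasticsearch-hadoop.git
cd elasticsearch-hadoop
# edit repository-hdfs/build.gradle: remove the line   exclude module: "commons-cli"
# edit gradle.properties: set   hadoop2Version = 2.0.0-cdh4.6.0   (plus esVersion/luceneVersions to match your stack)
cd repository-hdfs
../gradlew -Pdistro=hadoopYarn distZip
# the zip should land under repository-hdfs/build/distributions/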

You can install this with:
bin/plugin -u file:////elasticsearch-repository-hdfs-2.1.0.BUILD-SNAPSHOT-hadoop2.zip -i elasticsearch-repository-hdfs

Hope this helps.

-brent

I had a similar scenario: running CDH 4.6, I was unable to initialise the
repository with the hadoop2 version, and I started changing the Gradle build as
Brent suggests. However, maintaining our own version was a bit too much in our
case.

As Costin pointed out to me here
https://groups.google.com/forum/?fromgroups=#!topic/elasticsearch/613YHEUAtuA,
the lightweight jar is there precisely for the plethora of distributions, as long as
you have your Hadoop jars on the classpath.

Hey Jinyuan,

I'm having the same issue and want to solve it with the light version.
How are you doing the part about adding hadoop classpath to ES_CLASSPATH?

I tried executing `hadoop classpath` and added the result to "export
ES_CLASSPATH=", but I'm getting the same error. Is that what you meant?

Thank you,

Yes. The output of "hadoop classpath" should give you the list of jars a Hadoop
client needs in order to talk to the Hadoop cluster. As long as you put these on
Elasticsearch's classpath, it should eliminate many common issues, such as the
wrong version of a jar getting loaded or the wrong Hadoop client version being
used to talk to the cluster. Make sure you follow the classpath format
(colon-separated entries) when you append the output to ES_CLASSPATH.