Error while backing up ES to HDFS via plugin

The Hadoop nodes are NOT co-located with Elasticsearch; they run CDH 5.1.

I installed the plugin [elasticsearch/elasticsearch-repository-hdfs/2.0.2]
on all Elasticsearch nodes and restarted them.
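
For reference, I installed it with the ES 1.x plugin manager, roughly along these lines (the path to the plugin script depends on how Elasticsearch was installed):

# run on every Elasticsearch node, then restart the node
bin/plugin --install elasticsearch/elasticsearch-repository-hdfs/2.0.2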

When I try to set up the repo, I get the following error:

curl -XPUT 'http://localhost:9200/_snapshot/my_backup' -d '{"type": "hdfs","uri": "hdfs://nn-ip:8020/","conf_location":"/etc/conf/hadoop/conf/hdfs-site.xml,/etc/conf/hadoop/conf/core-site.xml","path": "elasticsearch","compress": true} }'

{"error":"RepositoryException[[my_backup] failed to create repository];
nested: CreationException[Guice creation errors:\n\n1) Error injecting
constructor, org.elasticsearch.ElasticsearchIllegalArgumentException: no
'path' defined for hdfs snapshot/restore\n at
org.elasticsearch.repositories.hdfs.HdfsRepository.(Unknown
Source)\n while locating
org.elasticsearch.repositories.hdfs.HdfsRepository\n while locating
org.elasticsearch.repositories.Repository\n\n1 error]; nested:
ElasticsearchIllegalArgumentException[no 'path' defined for hdfs
snapshot/restore]; ","status":500}

Please help: what additional configuration is needed to back up ES indices to a Hadoop cluster that is not co-located on the same nodes?

Ref: https://github.com/elasticsearch/elasticsearch-hadoop/tree/master/repository-hdfs


I made a bit more progress; now when I run this I get the following:

curl -XPUT 'http://localhost:9200/_snapshot/my_backup' -d '{"type": "hdfs","uri": "hds://nnip:8020/","conf_location":"/etc/conf/hadoop/conf/hdfs-site.xml,/etc/conf/hadoop/conf/core-site.xml","path": "elasticsearch","load_defaults": "true", "compress": true} }'

{"error":"RemoteTransportException[[node][inet[/10.3.89.102:9300]][cluster:admin/repository/put]];
nested: RepositoryException[[my_backup] failed to create repository];
nested: CreationException[Guice creation errors:\n\n1) Error injecting
constructor, java.lang.NoClassDefFoundError:
org/apache/commons/cli/ParseException\n at
org.elasticsearch.repositories.hdfs.HdfsRepository.()\n at
org.elasticsearch.repositories.hdfs.HdfsRepository\n at
Key[type=org.elasticsearch.repositories.Repository, annotation=[none]]\n\n1
error]; nested:
NoClassDefFoundError[org/apache/commons/cli/ParseException]; nested:
ClassNotFoundException[org.apache.commons.cli.ParseException];
","status":500}


What does GET /_cat/plugins give you?

--
David
Twitter : @dadoonet / @elasticsearchfr / @scrutmydocs


I think I understand.

You need to wrap the hdfs settings under a "settings" field:

$ curl -XPUT 'http://localhost:9200/_snapshot/my_backup' -d '{
  "type": "hdfs",
  "settings": {
    // settings here
  }
}'
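
For example, with the settings from your earlier request it would look roughly like this (uri, conf_location and path are the ones you posted; adjust as needed):

curl -XPUT 'http://localhost:9200/_snapshot/my_backup' -d '{
  "type": "hdfs",
  "settings": {
    "uri": "hdfs://nn-ip:8020/",
    "conf_location": "/etc/conf/hadoop/conf/hdfs-site.xml,/etc/conf/hadoop/conf/core-site.xml",
    "path": "elasticsearch",
    "load_defaults": "true",
    "compress": true
  }
}'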
HTH

David


Thanks for the tip; now I am getting the following error:

curl -XPUT 'http://localhost:9200/_snapshot/my_backup' -d '{"type": "hdfs","settings":{"uri": "hdfs://ip01:8020/","conf_location":"/etc/conf/hadoop/conf/hdfs-site.xml,/etc/conf/hadoop/conf/core-site.xml","path": "elasticsearch","load_defaults": "true", "compress": true} }'
{"error":"RemoteTransportException[[ip03][inet[/10.3.89.103:9300]][cluster:admin/repository/put]]; nested: RepositoryException[[my_backup] failed to create repository]; nested: CreationException[Guice creation errors:\n\n1) Error injecting constructor, java.lang.NoClassDefFoundError: org/elasticsearch/common/blobstore/ImmutableBlobContainer\n at org.elasticsearch.repositories.hdfs.HdfsRepository.()\n at org.elasticsearch.repositories.hdfs.HdfsRepository\n at Key[type=org.elasticsearch.repositories.Repository, annotation=[none]]\n\n1 error]; nested: NoClassDefFoundError[org/elasticsearch/common/blobstore/ImmutableBlobContainer]; nested: ClassNotFoundException[org.elasticsearch.common.blobstore.ImmutableBlobContainer]; ","status":500}

What does GET /_cat/plugins give you?
What is your Elasticsearch version?
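
i.e. something like:

# list the plugins installed on each node
curl 'http://localhost:9200/_cat/plugins?v'
# show the Elasticsearch version
curl 'http://localhost:9200/'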

--
David Pilato | Technical Advocate | Elasticsearch.com
@dadoonet | @elasticsearchfr | @scrutmydocs


I finally got it to work with the latest beta plugin. Thank you guys for your help.

I hit the same problem as Salman and also solved it by using the latest
beta version. BTW, I tested on elasticsearch-1.4.2.
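
For anyone hitting this later: switching to the beta boiled down to swapping the plugin on every node and restarting. Roughly like this, where the plugin name is whatever _cat/plugins reports and <latest-beta-version> is a placeholder for the current beta coordinate from the plugin README:

bin/plugin --remove elasticsearch-repository-hdfs
bin/plugin --install elasticsearch/elasticsearch-repository-hdfs/<latest-beta-version>
# restart the node afterwards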


I am getting the following error:

curl -XPUT 'http://localhost:9200/_snapshot/my_es_backup' -d '{
  "type": "hdfs",
  "settings": {
    "location": "hdfs://nnfailover:8020",
    "path": "/logs/apce_logs",
    "conf_location": "/etc/hadoop/conf/hdfs-site.xml,/etc/hadoop/conf/core-site.xml"
  }
}'
{"error":"RemoteTransportException[[omhq1947][inet[/167.132.88.251:9300]][cluster/repository/put]]; nested: RepositoryException[[my_es_backup] failed to create repository]; nested: ExecutionError[org.elasticsearch.common.util.concurrent.ExecutionError: java.lang.NoClassDefFoundError: Lorg/apache/hadoop/fs/FileSystem;]; nested: ExecutionError[java.lang.NoClassDefFoundError: Lorg/apache/hadoop/fs/FileSystem;]; nested: NoClassDefFoundError[Lorg/apache/hadoop/fs/FileSystem;]; nested: ClassNotFoundException[org.apache.hadoop.fs.FileSystem]; ","status":500}

Is using the beta version my only option?

Thanks

Have you checked whether the specified class is actually found in a jar on the classpath?

That particular class is present on the ES_CLASSPATH that I exported:

jar tf hadoop-common.jar |grep "fs/FileSystem.class"
org/apache/hadoop/fs/FileSystem.class

What I noticed is that there is no 'L' in front of 'org'; is that an issue?

Thanks
logic4fun

The following is my complete classpath:
echo $ES_CLASSPATH
/etc/hadoop/conf:/usr/lib/hadoop/lib/:/usr/lib/hadoop/.//:/usr/lib/hadoop-hdfs/./:/usr/lib/hadoop-hdfs/lib/:/usr/lib/hadoop-hdfs/.//:/usr/lib/hadoop-yarn/lib/:/usr/lib/hadoop-yarn/.//:/usr/lib/hadoop-mapreduce/lib/:/usr/lib/hadoop-mapreduce/.//::/usr/share/java/mysql-connector-java-5.1.17.jar:/usr/share/java/mysql-connector-java.jar:/usr/lib/hadoop-mapreduce/:/usr/lib/tez/:/usr/lib/tez/lib/*:/etc/tez/conf
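
(For completeness, here is a rough way to check which jar on that classpath actually contains the class; the globs below follow the CDH layout above and may need adjusting.)

for j in /usr/lib/hadoop/*.jar /usr/lib/hadoop/lib/*.jar /usr/lib/hadoop-hdfs/*.jar; do
  unzip -l "$j" 2>/dev/null | grep -q 'org/apache/hadoop/fs/FileSystem.class' && echo "$j"
done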

How was this classpath issue solved?

I have currently run into the same issue.