Elasticsearch-hdfs snapshot failed: Server IPC version 9 cannot communicate with client version 4

Setup:

elasticsearch 2.1.1
hadoop 2.2.0

I have been trying to snapshot Elasticsearch to an HDFS repository, but it fails.
I am following this document.

This call fails:

curl -XPUT 'http://localhost:9200/_snapshot/my_backup' -d '{
  "type": "hdfs",
  "settings": {
    "uri": "hdfs://127.0.0.1:9000/",
    "conf_location": "pathTo/hadoop-2.2.0/etc/hadoop/hdfs-site.xml,pathTo/hadoop-2.2.0/etc/hadoop/core-site.xml",
    "path": "elasticsearch",
    "load_defaults": "true",
    "compress": true
  }
}'

with error:

{"error":{"root_cause":[{"type":"repository_exception","reason":"[my_backup] failed to create repository"}],"type":"repository_exception","reason":"[my_backup] failed to create repository","caused_by":{"type":"creation_exception","reason":"Guice creation errors:\n\n1) Error injecting constructor, ElasticsearchGenerationException[Cannot create Hdfs file-system for uri [hdfs://127.0.0.1:9000/]]; nested: RemoteException[Server IPC version 9 cannot communicate with client version 4];\n at org.elasticsearch.repositories.hdfs.HdfsRepository.(Unknown Source)\n while locating org.elasticsearch.repositories.hdfs.HdfsRepository\n while locating org.elasticsearch.repositories.Repository\n\n1 error","caused_by":{"type":"generation_exception","reason":"Cannot create Hdfs file-system for uri [hdfs://127.0.0.1:9000/]","caused_by":{"type":"remote_exception","reason":"Server IPC version 9 cannot communicate with client version 4"}}}},"status":500}

The content of the hadoop-libs directory under the repository-hdfs plugin looks like this:

commons-codec-1.4.jar commons-lang-2.4.jar jackson-mapper-asl-1.8.8.jar
commons-collections-3.2.1.jar commons-logging-1.1.1.jar jets3t-0.6.1.jar
commons-configuration-1.6.jar commons-math-2.1.jar oro-2.0.8.jar
commons-digester-1.8.jar commons-net-1.4.1.jar xmlenc-0.52.jar
commons-httpclient-3.0.1.jar hadoop-core-1.2.1.jar
commons-io-2.1.jar jackson-core-asl-1.8.8.jar

I see the plugin being loaded during Elasticsearch startup:

[INFO ][plugin.hadoop.hdfs ] Loaded Hadoop [1.2.1] libraries from file:/usr/share/elasticsearch/plugins/repository-hdfs/

What is causing the snapshot to fail?
Any help would be appreciated.

The exception appears because you are not using the correct Hadoop version with the HDFS plugin, as explained in the docs. You are essentially running the plugin (which bundles Hadoop 1.2.1) against a Hadoop 2.x cluster. Using the plugin flavour that bundles Hadoop 2, or matching the libraries on both sides, fixes this. You can verify the mismatch as shown below.
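A quick way to confirm this (a sketch, assuming the default plugin location shown in your startup log) is to compare the Hadoop jars the plugin bundles against the Hadoop version of your cluster:

# Hadoop client libraries bundled with the plugin (shows hadoop-core-1.2.1.jar in your listing)
ls /usr/share/elasticsearch/plugins/repository-hdfs/hadoop-libs/ | grep hadoop

# Hadoop version of the cluster you are snapshotting to (2.2.0 in your setup)
hadoop version

As long as the first command shows a 1.x jar and the second reports 2.x, the RPC layers cannot talk to each other, which is exactly what "Server IPC version 9 cannot communicate with client version 4" means (Hadoop 2.x speaks IPC 9, the Hadoop 1.x client speaks IPC 4).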

Thanks Costin,

I tried upgrading ES to 2.2.0 and retried the call:

curl -XPUT 'http://localhost:9200/_snapshot/my_backup' -d '{
  "type": "hdfs",
  "settings": {
    "uri": "hdfs://127.0.0.1:9000/",
    "conf_location": "~/hadoop-2.2.0/etc/hadoop/hdfs-site.xml,~/hadoop-2.2.0/etc/hadoop/core-site.xml",
    "path": "elasticsearch",
    "load_defaults": "true",
    "compress": true
  }
}'

It gives the same error.

Another thing: the plugin I installed with bin/plugin install elasticsearch/elasticsearch-repository-hdfs/2.2.0 still ships hadoop-core-1.2.1.jar under hadoop-libs. I tried replacing it with hadoop-common-2.2.0 and the other 2.2.0 auth jars, but that also failed.

Which plugin should I download to get this working? Which versions of Hadoop and ES?

Thanks

@costin ,

Any suggestions on this?

Have you read the docs I mentioned? I ask because they contain a section on the various flavours of the plugin, including Hadoop 1 and Hadoop 2:

Yarn / Hadoop 2.x
The hadoop2 version contains the plugin jar plus the Hadoop 2.x (Yarn) dependencies.
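Concretely, that means removing the default (Hadoop 1) flavour and installing the hadoop2 one. A minimal sketch, assuming the hadoop2 build is published under the same coordinates with a -hadoop2 suffix (check the docs for the exact artifact name for your ES version):

# remove the plugin that bundles the Hadoop 1.2.1 client
bin/plugin remove repository-hdfs

# install the flavour that bundles the Hadoop 2.x (Yarn) client libraries (coordinate is an assumption)
bin/plugin install elasticsearch/elasticsearch-repository-hdfs/2.2.0-hadoop2

After restarting the node, re-run the PUT _snapshot call from above; with matching client and server versions the IPC error should go away.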