Hive ES Hadoop not finding ES cluster

hello all,
The Elasticsearch cluster is separate from the Hadoop cluster, and connectivity has been tested between the Hadoop data nodes and Elasticsearch.
I am able to create an index in Elasticsearch, but I am not able to run a MapReduce job on the Hadoop cluster that indexes the data into Elasticsearch.

I created a Hive table as follows:
CREATE EXTERNAL TABLE es_datacatalogsearchindex (
  dataset_id STRING,
  search_string STRING)
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES(
  'es.resource' = 'datacatalogsearchindex/concatenatedRecord',
  'es.nodes.wan.only' = 'true',
  'es.node' = 'real IP address');

INSERT INTO es_datacatalogsearchindex SELECT * FROM concatenated_index LIMIT 100;

The insert fails with the following exception:

Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Cannot detect ES version - typically this happens if the network/Elasticsearch cluster is not accessible or when targeting a WAN/Cloud instance without the proper setting 'es.nodes.wan.only'

at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:248)

at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketForFileIdx(FileSinkOperator.java:570)

at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:514)

... 13 more

Caused by: org.elasticsearch.hadoop.EsHadoopIllegalArgumentException: Cannot detect ES version - typically this happens if the network/Elasticsearch cluster is not accessible or when targeting a WAN/Cloud instance without the proper setting 'es.nodes.wan.only'

at org.elasticsearch.hadoop.rest.InitializationUtils.discoverEsVersion(InitializationUtils.java:196)

at org.elasticsearch.hadoop.hive.HiveUtils.init(HiveUtils.java:142)

at org.elasticsearch.hadoop.hive.EsHiveOutputFormat.getHiveRecordWriter(EsHiveOutputFormat.java:93)

at org.elasticsearch.hadoop.hive.EsHiveOutputFormat.getHiveRecordWriter(EsHiveOutputFormat.java:42)

at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:260)

at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:245)

... 15 more

Caused by: org.elasticsearch.hadoop.rest.EsHadoopNoNodesLeftException: Connection error (check network and/or proxy settings)- all nodes failed; tried [[127.0.0.1:9200]]

at org.elasticsearch.hadoop.rest.NetworkClient.execute(NetworkClient.java:142)

at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:434)

at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:414)

at org.elasticsearch.hadoop.rest.RestClient.execute(RestClient.java:418)

at org.elasticsearch.hadoop.rest.RestClient.get(RestClient.java:122)

at org.elasticsearch.hadoop.rest.RestClient.esVersion(RestClient.java:564)

at org.elasticsearch.hadoop.rest.InitializationUtils.discoverEsVersion(InitializationUtils.java:184)

... 20 more

This looks like the job doesn't have es.nodes set to anything, and is therefore defaulting to localhost.

[quote="darshanpandya, post:1, topic:60181"]
'es.node' = 'real IP address'
[/quote]

Change this setting to es.nodes instead of es.node. The connector only reads es.nodes, so the misspelled key is ignored and the default of localhost:9200 is used, which matches the [[127.0.0.1:9200]] in your stack trace. Hope that helps :slight_smile:
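As a rough sketch, the corrected DDL would look like the following. This assumes your cluster is reachable from the Hadoop data nodes on the default port 9200; if it listens elsewhere, also set 'es.port'.

-- note: es.nodes (plural), not es.node; replace 'real IP address' with the actual host or IP of the Elasticsearch cluster
CREATE EXTERNAL TABLE es_datacatalogsearchindex (
  dataset_id STRING,
  search_string STRING)
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES(
  'es.resource' = 'datacatalogsearchindex/concatenatedRecord',
  'es.nodes.wan.only' = 'true',
  'es.nodes' = 'real IP address');

You will need to drop and recreate the external table (or use ALTER TABLE ... SET TBLPROPERTIES) for the new setting to take effect, then rerun the insert.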

Thanks James!