[Spark] saveToEs issue


#1

Hi,

Consider the following example:

JavaSparkContext jsc = new JavaSparkContext(sparkConf);

Map<String, ?> numbers = ImmutableMap.of("one", 1, "two", 2);
Map<String, ?> airports = ImmutableMap.of("OTP", "Otopeni", "SFO", "San Fran");

// initialize the RDD
JavaRDD<Map<String, ?>> javaRDD = jsc.parallelize(ImmutableList.of(numbers, airports));

I noticed that the following line throws the exception "Cluster state volatile; cannot find node backing shards - please check whether your cluster is stable":

JavaEsSpark.saveToEs(javaRDD, "testindex/testdoc");

But when I add the type as a field in each document and write with a dynamic type pattern instead:

Map<String, ?> numbers = ImmutableMap.of("doc", "dummy", "one", 1, "two", 2);
Map<String, ?> airports = ImmutableMap.of("doc", "dummy", "OTP", "Otopeni", "SFO", "San Fran");

JavaEsSpark.saveToEs(javaRDD, "testindex/{doc}");

... it works!
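As I understand it, a dynamic pattern like "testindex/{doc}" is resolved per document: each "{field}" placeholder is replaced with that field's value from the document being written. This is only a minimal sketch of that substitution idea (a hypothetical resolve helper, not the actual elasticsearch-hadoop implementation):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class DynamicResourceSketch {

    // Hypothetical stand-in for the per-document resolution that a dynamic
    // resource pattern implies: each {field} in the pattern is replaced with
    // that field's value taken from the document.
    static String resolve(String pattern, Map<String, ?> doc) {
        Matcher m = Pattern.compile("\\{(\\w+)\\}").matcher(pattern);
        StringBuffer sb = new StringBuffer();
        while (m.find()) {
            Object value = doc.get(m.group(1));
            if (value == null) {
                throw new IllegalArgumentException("missing field: " + m.group(1));
            }
            m.appendReplacement(sb, Matcher.quoteReplacement(value.toString()));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        Map<String, Object> numbers = new LinkedHashMap<>();
        numbers.put("doc", "dummy");
        numbers.put("one", 1);
        numbers.put("two", 2);
        // Each document supplies its own type via the "doc" field,
        // so "testindex/{doc}" resolves to "testindex/dummy" here.
        System.out.println(resolve("testindex/{doc}", numbers));
    }
}
```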

I don't understand what is going on. Can someone explain?

Thanks for helping me :slight_smile:


(system) #2

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.