Below is the action I want to take (read from one Elasticsearch cluster and write to another):
// Context configured for the source ("read") cluster
SparkConf rconf = new SparkConf();
JavaSparkContext rjsc = new JavaSparkContext(rconf);

// Context configured for the target ("write") cluster
SparkConf wconf = new SparkConf();
JavaSparkContext wjsc = new JavaSparkContext(wconf);

// Read from the source cluster (index name omitted here)
JavaRDD<Map<String, Object>> esRDD = JavaEsSpark.esRDD(rjsc, "").values();

// Move the data to the "write" context and save it to the target cluster
JavaRDD<Map<String, Object>> esRDD2 = wjsc.union(esRDD);
JavaEsSpark.saveToEs(esRDD2, "xx");
But Spark won't support two contexts per application, so I'd like to know: is there an easy way to do this?
You could use a single context and provide the settings at the RDD level instead of at the context level. We don't really test scenarios with multiple clusters, so if you run into any issues, feel free to bring them up here and we'll try to help you work through them.
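For example, here's a minimal sketch of that approach using the overloads of JavaEsSpark.esRDD and JavaEsSpark.saveToEs that accept a per-RDD settings map. The node addresses and index names (source-cluster, target-cluster, source-index, target-index) are placeholders for illustration:

import java.util.HashMap;
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.elasticsearch.spark.rdd.api.java.JavaEsSpark;

public class CrossClusterCopy {
    public static void main(String[] args) {
        // A single context for the whole application
        SparkConf conf = new SparkConf().setAppName("es-cross-cluster-copy");
        JavaSparkContext jsc = new JavaSparkContext(conf);

        // RDD-level settings pointing at the source cluster (placeholder address)
        Map<String, String> readCfg = new HashMap<>();
        readCfg.put("es.nodes", "source-cluster:9200");

        // RDD-level settings pointing at the target cluster (placeholder address)
        Map<String, String> writeCfg = new HashMap<>();
        writeCfg.put("es.nodes", "target-cluster:9200");

        // Read from an index on the source cluster
        JavaRDD<Map<String, Object>> docs =
            JavaEsSpark.esRDD(jsc, "source-index", readCfg).values();

        // Write the documents to an index on the target cluster
        JavaEsSpark.saveToEs(docs, "target-index", writeCfg);

        jsc.stop();
    }
}

Settings passed this way apply only to that particular read or write, so each RDD can target a different cluster while sharing one SparkContext.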