Exception while inserting data into Elasticsearch using Apache Spark (org.apache.spark.SparkException: Task not serializable)


(Basim) #1

I am using a custom Spark receiver for Spark Streaming. In my custom stream I receive values in a JavaDStream, but when I try to save the data into Elasticsearch with the saveJsonToEs function, it throws:

org.apache.spark.SparkException: Task not serializable
at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:298)

    spark.getCustomReceiverStream().foreachRDD(rdd -> {
        rdd.foreachPartition(partition -> {
            partition.forEachRemaining(row -> {
                LOG.debug("printing here : " + row.toString());
                // Any further action I perform on `row` here throws the exception
            });
        });
    });
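A common cause of "Task not serializable" in code like this: the lambda passed to foreachPartition references an instance field of the enclosing class (here, LOG), so Spark's ClosureCleaner tries to serialize the whole enclosing object, and a non-serializable field (a typical Logger) makes that fail. Making the logger static, or capturing only local serializable values, usually avoids it. Below is a minimal plain-Java sketch of the same mechanism, without Spark; the class and method names are made up for illustration:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class ClosureDemo {
    // Stand-in for a non-serializable instance field such as a Logger
    private final Object logger = new Object();

    // Serializable functional interface, analogous to the closures Spark ships to executors
    interface SerTask extends Runnable, Serializable {}

    SerTask capturingInstanceField() {
        // Referencing `this.logger` drags the whole (non-serializable)
        // ClosureDemo instance into the closure -> serialization fails
        return () -> System.out.println(logger);
    }

    static SerTask capturingLocalOnly() {
        String msg = "row";  // a local String is Serializable
        return () -> System.out.println(msg);
    }

    // Returns true if the object survives Java serialization
    static boolean serializes(Object o) {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(serializes(new ClosureDemo().capturingInstanceField())); // false
        System.out.println(serializes(capturingLocalOnly()));                       // true
    }
}
```

In the Spark version, the equivalent fixes would be declaring LOG as `private static final` (static fields are not captured), or skipping the manual loop entirely and calling the elasticsearch-hadoop connector's saveJsonToEs on the DStream/RDD itself, which handles the per-partition writes for you.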

(system) #2

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.