How to APPEND new data to an Elasticsearch index against an id using Spark Scala

Hi,

I am using Spark for processing, and after processing I ingest the data into Elasticsearch. The issue is that every time I run the code it updates the existing document, but I want the new data to be appended instead.

I have tried the "index" write operation:

val conf: SparkConf = new SparkConf()
  .setAppName("spark session example")
  .setMaster("local")
  .set("es.index.auto.create", "true")
  .set("es.nodes", "nodes")
  .set("es.port", "9200")
  .set("es.http.timeout", "1m")
  .set("es.mapping.id", "session_id")
  .set("es.write.operation", "index")

I have also tried the "upsert" write operation:

val conf: SparkConf = new SparkConf()
  .setAppName("spark session example")
  .setMaster("local")
  .set("es.index.auto.create", "true")
  .set("es.nodes", "nodes")
  .set("es.port", "9200")
  .set("es.http.timeout", "1m")
  .set("es.mapping.id", "session_id")
  .set("es.write.operation", "upsert")
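
From the docs I get the impression that neither of these can append to an array: "index" replaces the whole document, and a plain "upsert" only merges top-level fields. What I think I need is a scripted upsert via the connector's es.update.script.* settings. Here is a minimal sketch of what I mean; the exact setting names may differ across es-hadoop versions, and the Painless script and the "sections" parameter name are my own guesses:

```scala
import org.apache.spark.SparkConf

// Sketch: scripted upsert so each run appends to ga_section instead of
// replacing the stored document. Setting names assume a recent es-hadoop;
// older versions use "es.update.script" instead of "es.update.script.inline".
val conf: SparkConf = new SparkConf()
  .setAppName("spark session example")
  .setMaster("local")
  .set("es.index.auto.create", "true")
  .set("es.nodes", "nodes")
  .set("es.port", "9200")
  .set("es.http.timeout", "1m")
  .set("es.mapping.id", "session_id")
  .set("es.write.operation", "upsert")
  // Bind the incoming record's ga_section field to a script param "sections".
  .set("es.update.script.params", "sections:ga_section")
  .set("es.update.script.lang", "painless")
  // Append the incoming values to the array already stored in the document.
  .set("es.update.script.inline", "ctx._source.ga_section.addAll(params.sections)")
```

My understanding is that with an upsert, the first write for a given session_id simply indexes the record, and later writes run the script against the stored document.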

**My Requirement:**

Suppose the first time, the ingested data looks like this:

"_source": {
"session_id": "111",
"ga_section": [
"abc",
"abc",
"def"
]
}

Now when I run the code again, the new data should be appended like this
(suppose this run produces "abc", "abc" against the same id):

"_source": {
"session_id": "111",
"ga_section": [
"abc",
"abc",
"def",
"abc",
"abc"
]
}
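
For completeness, here is a sketch of how I would invoke the write for that second run, assuming the scripted-upsert conf from the sketch above is in scope; the "sessions/session" index/type and the sample record are placeholders of mine:

```scala
import org.apache.spark.SparkContext
import org.elasticsearch.spark._

// Second run: "abc", "abc" against the same session_id. With the
// scripted-upsert conf above, this should append to ga_section rather
// than replace the document. "sessions/session" is a placeholder.
val sc = new SparkContext(conf)
val records = Seq(Map("session_id" -> "111", "ga_section" -> Seq("abc", "abc")))
sc.makeRDD(records).saveToEs("sessions/session")
sc.stop()
```

Is the scripted upsert the right approach here, or is there a simpler way to get append semantics?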
