Hi,
My requirement is to index a JSONArray as one of the field values in Elasticsearch using Spark. I am trying the following, but with no luck:
import org.codehaus.jettison.json.JSONArray
import org.elasticsearch.spark.rdd.EsSpark

val resMap = scala.collection.mutable.Map[String, Any]()
for (....) {
  resMap += ("tagName" -> k, "objectids" -> v)   // k is the tag name, v is a JSONArray; a few entries are added here
}
val saveRdd = sc.parallelize(Seq(resMap))
val esconf = Map("es.nodes" -> getConf(ELASTICSEARCH).split(":")(0),
                 "es.port" -> "9200",
                 "es.index.auto.create" -> "true",
                 "es.mapping.id" -> "tagName")
EsSpark.saveToEs(saveRdd, customerId + "test/objectTags", esconf)
But I am getting the following exception:
org.elasticsearch.hadoop.serialization.EsHadoopSerializationException: Cannot handle type [class org.codehaus.jettison.json.JSONArray] within type [class scala.collection.mutable.HashMap]
What is the best way to index this type of data? We are using Spark 1.6.2 and Elasticsearch 2.3.4.
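
From the error it looks like the connector only knows how to serialize standard Scala/Java collections, so one thing I was considering is flattening the JSONArray into a plain Scala Seq before building the document. Below is a rough, untested sketch of what I mean; the helper name and sample values are just placeholders, and sc, esconf and customerId are the same as in the code above:

import org.codehaus.jettison.json.JSONArray
import org.elasticsearch.spark.rdd.EsSpark

// Flatten a JSONArray into a plain Scala Seq (assuming the object ids are strings)
// so that es-hadoop can serialize it.
def jsonArrayToSeq(arr: JSONArray): Seq[String] =
  (0 until arr.length()).map(i => arr.getString(i))

// Hypothetical sample data, only to illustrate the shape of one document.
val ids = new JSONArray()
ids.put("objectId-1")
ids.put("objectId-2")

val doc = Map("tagName" -> "someTag", "objectids" -> jsonArrayToSeq(ids))

// Same sc, esconf and customerId as in the code above.
val saveRdd = sc.parallelize(Seq(doc))
EsSpark.saveToEs(saveRdd, customerId + "test/objectTags", esconf)

Would this be the recommended approach, or is there a better way?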
Thanks,
Malini