How to support Bulk Delete in Es-hadoop?


(Zhifeng Ma Beijing) #1

Hi All,

My use case is as follows: I want to implement a BulkDelete with the help of the ES-Spark library, called once for each partition.

rdd.foreachPartition { p =>
  BulkDelete(index, docType, p.toIterable)
}

def BulkDelete(index: String, docType: String, ids: Iterable[String]): Unit = {
  ...
}

Could you suggest how to use es-spark library to achieve that?

best regards,
zhifeng


(James Baiera) #2

@zhifengMaBeijing Delete is not yet supported in ES-Hadoop. There is an open ticket tracking this as a feature request here: https://github.com/elastic/elasticsearch-hadoop/issues/348#issuecomment-69394134
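Since ES-Hadoop itself has no delete support, a common workaround is to bypass the connector and call Elasticsearch's `_bulk` REST endpoint directly from each partition. A minimal sketch of building the bulk-delete request body, assuming hypothetical index/type names and that you supply your own HTTP client to POST the result to `http://<es-host>:9200/_bulk`:

```scala
// Sketch only: builds the NDJSON body for a _bulk request made of delete
// actions. Index and type names here are illustrative assumptions; the POST
// step (any HTTP client) is left out because it depends on your cluster setup.
object BulkDeleteSketch {

  // One action line per document id; the _bulk API requires newline-delimited
  // JSON with a trailing newline after the last action.
  def bulkDeleteBody(index: String, docType: String, ids: Iterable[String]): String =
    ids.map { id =>
      s"""{"delete":{"_index":"$index","_type":"$docType","_id":"$id"}}"""
    }.mkString("", "\n", "\n")
}

// Per-partition usage (inside rdd.foreachPartition): build the body from the
// partition's ids, then POST it to the _bulk endpoint with your HTTP client.
```

This keeps the delete traffic in the same per-partition pattern as the original question; each executor talks to Elasticsearch directly rather than through es-spark.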


(system) #3

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.