How to support bulk delete in ES-Hadoop?

Hi All,

My use case is as follows: I want to implement a bulk delete using the es-spark library, invoked from each partition.

rdd.foreachPartition { p =>
  // collect the document ids in this partition and pass them to bulkDelete
}

def bulkDelete(indexO: String, tyO: String, ids: Iterable[String]): Unit = {
  // ...
}

Could you suggest how to use the es-spark library to achieve this?

best regards,

@zhifengMaBeijing Deletes are not yet supported in ES-Hadoop. There is an open ticket tracking this as a feature request here:
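Until delete support lands in ES-Hadoop, one common workaround is to bypass es-spark and call Elasticsearch's `_bulk` REST endpoint directly from each partition. The sketch below (names like `bulkDeleteBody` are my own, not part of any library) builds the NDJSON payload of `delete` actions that the Bulk API expects; you would then POST it to `http://<es-host>:9200/_bulk` with `Content-Type: application/x-ndjson` using whatever HTTP client is available on the executors.

```scala
// Hypothetical helper: builds the NDJSON body for a Bulk API delete request.
// Each line is one delete action; the Bulk API requires a trailing newline.
def bulkDeleteBody(index: String, tpe: String, ids: Iterable[String]): String =
  ids
    .map(id => s"""{"delete":{"_index":"$index","_type":"$tpe","_id":"$id"}}""")
    .mkString("", "\n", "\n")

// Sketch of how it could be wired into the partition loop from the question,
// assuming each record exposes its document id:
// rdd.foreachPartition { p =>
//   val body = bulkDeleteBody("my-index", "my-type", p.map(_.id).toIterable)
//   // POST body to http://<es-host>:9200/_bulk (application/x-ndjson)
// }
```

Batching the ids per partition keeps the number of HTTP round trips low, which is the same reason es-spark batches its own bulk writes.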

