Python/Elasticsearch Bulk problem

Hi all,

I would like to perform a bulk request with the following action: create a document if it does not exist, but update it if it does. Within the update, I append new values (new_a and new_b) to arrays inside the document (a and b). For a single document, this works flawlessly when I invoke the update function with the following body:

{"script": {"inline": "ctx._source.a.addAll(params.new_a); ctx._source.b.addAll(params.new_b)",
            "params": {"new_a": new_a, "new_b": new_b}},
 "upsert": {"a": new_a, "b": new_b}}
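For context, a minimal sketch of how such a body can be built and sent for a single document (the helper name `make_upsert_body` is mine, and the commented `es.update(...)` call assumes an Elasticsearch 5.x-style Python client where `inline` scripts and `doc_type` are still valid):

```python
def make_upsert_body(new_a, new_b):
    # The Painless script appends the new values to the existing arrays;
    # "upsert" is used as the initial document if the id does not exist yet.
    return {
        "script": {
            "inline": "ctx._source.a.addAll(params.new_a); "
                      "ctx._source.b.addAll(params.new_b)",
            "params": {"new_a": new_a, "new_b": new_b},
        },
        "upsert": {"a": new_a, "b": new_b},
    }

# Hypothetical single-document call (requires a running cluster):
# from elasticsearch import Elasticsearch
# es = Elasticsearch()
# es.update(index="my_index", doc_type="my_type", id="1",
#           body=make_upsert_body([1], [2]))
```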

However, when I try to expand this logic into a bulk request, I hit a wall. Thank you in advance for your help.

EDIT: Preview purposes only


Can you maybe explain in more detail what exactly fails and what your code looks like? Are you calling the bulk endpoint yourself, or are you using the bulk helper?



Hi Alex,

so what I do is loop over my data in Python and yield every map (dictionary) that I need to update. What I send to the helpers.bulk() function from the Python Elasticsearch library follows this scheme:

{"_index": index_name,
 "op_type": "update",
 "_type": doc_type,
 "_id": doc_id,
 "_source": {"source": updateSchema}}

in which updateSchema is the map from my original post. The script is not executed; rather than storing the results of the script, it stores the script itself as a literal string. Additionally, I tried the variant with the body explained at the link, but again the script was stored as a string instead of being executed.


Figured it out: op_type has to be sent with a leading underscore (`_op_type`) to be interpreted properly. Without it, the helper falls back to a plain index action and stores the script body verbatim as a document.


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.