Update to Elasticsearch fails when lines are parsed one after another

Hi there,
I'm using Logstash to send parsed log data to ES. If a certain field matches the one from the previous line, only a few fields are kept, and Logstash should update the document that was created for the previous line.

The problem, as I understand it, is that Logstash sends bulk requests, so each time I get a document-not-found exception because the previous line's document hasn't been created yet.

Does anybody know a workaround for this?

Thank you,

Hi Aviv,

Take a look at the [elasticsearch filter plugin](https://www.elastic.co/guide/en/logstash/current/plugins-filters-elasticsearch.html); it might help you.
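Another common approach, if the related lines share a field you can use as the document ID, is to route everything through an update with `doc_as_upsert` in the elasticsearch output. That way, if the "update" line arrives before the "create" line has been indexed, Elasticsearch creates the document instead of failing. A sketch, assuming a hypothetical `session_id` field that ties the lines together (adjust `hosts`, `index`, and the ID field to your setup):

```
output {
  elasticsearch {
    hosts         => ["localhost:9200"]
    index         => "logs"
    document_id   => "%{session_id}"   # hypothetical field shared by related lines
    action        => "update"
    doc_as_upsert => true              # create the document if it doesn't exist yet
  }
}
```

Note that this changes the semantics slightly: every event becomes an upsert on the same document ID, so make sure that merging fields this way is what you want.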