Logstash pipeline in a transaction

Hi All,
I need Logstash to write the received data into two separate Elasticsearch indices. On its own that is a simple case, but the insertion into the two indices has to be transactional: if the insertion into the second index fails, the whole transaction should be rolled back.
Are there any possibilities to do this?

There is no transactional support across outputs. If the failure is one that causes the event to be sent to a dead letter queue (DLQ), you could process the DLQ files to make sure each such event is removed from both indexes.
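As a rough sketch of that cleanup approach: a second pipeline reads failed events back from the DLQ with the `dead_letter_queue` input and issues deletes against both indexes. This only works if the original pipeline sets an explicit `document_id` (auto-generated IDs cannot be matched later). The paths, index names, and the `[fingerprint]` field are assumptions for illustration, not anything from your setup.

```
# Hypothetical DLQ cleanup pipeline (not transactional rollback,
# just best-effort removal from both indexes after a failure).
input {
  dead_letter_queue {
    path            => "/var/lib/logstash/dead_letter_queue"  # assumed DLQ path
    pipeline_id     => "main"                                 # assumed pipeline id
    commit_offsets  => true
  }
}

output {
  # Assumes the original pipeline indexed with
  # document_id => "%{[fingerprint]}", so the same id can be deleted here.
  elasticsearch {
    hosts       => ["http://localhost:9200"]
    index       => "index-a"                 # assumed first index name
    action      => "delete"
    document_id => "%{[fingerprint]}"
  }
  elasticsearch {
    hosts       => ["http://localhost:9200"]
    index       => "index-b"                 # assumed second index name
    action      => "delete"
    document_id => "%{[fingerprint]}"
  }
}
```

Note the caveat: the DLQ only captures certain failures (e.g. mapping errors, 4xx responses), so this does not cover every way an insert can fail, and there is still a window where one index has the document and the other does not.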

Thanks!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.