How to handle events rejected by the Kafka output plugin

I recently encountered a situation in which my Kafka broker was unable to accept an event from Logstash (the event was too large). The Kafka output plugin's default configuration is to retry forever. Unfortunately, this resulted in an extended outage during which Logstash was unable to process any events. To avoid this in the future, I've added a filter which truncates the message field in the event, and also set a retry limit of 10 on the Kafka output plugin. This means that in the future, events which cannot be sent to Kafka will simply be discarded. While this is preferable to retrying forever and losing a large quantity of incoming events, I would still like to be able to recover the events which were rejected by Kafka. So the question is:
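For reference, the mitigation described above could be sketched roughly like this. The field name `message`, the 1 MB `length_bytes` cap, and the placement of `retries` are illustrative assumptions, not my exact config:

```
filter {
  truncate {
    # Cap the message field so oversized events stay under the
    # broker's max message size (limit chosen for illustration).
    fields => ["message"]
    length_bytes => 1048576
  }
}

output {
  kafka {
    # Give up after 10 attempts instead of retrying forever;
    # events that still fail are dropped.
    retries => 10
    # ... broker/topic configuration ...
  }
}
```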

Is there any way to configure Logstash to send an event to one output plugin if and only if another fails?

Ideally, my output configuration would look something like this:

output {
  kafka {
    # configuration details
  }
  if kafka plugin failed {
    file {
      # configuration details
    }
  }
}

You're looking for dead letter queue support in the Kafka output.
