How can I get a notification of when my elasticsearch output fails to index an event?

We have some external data, which we do not control, that gets compiled and inserted into a SQL Server table. I wrote a Logstash pipeline that queries that table and processes the data for insertion into our Elasticsearch index. This morning we received some data that was out of bounds for the field's data type, so we hit warnings that Elasticsearch could not index the event. That is fine with us, since we know the source data will eventually be fixed and reimported, but we would like to be notified whenever this happens.

I've tried the slack output plugin and the email output plugin, but I cannot get Logstash to send out a notification.

input {
  jdbc {
    jdbc_driver_class => ""
    jdbc_connection_string => "jdbc:sqlserver://REDACTED;"
    jdbc_user => "REDACTED"
    jdbc_password => "REDACTED"
    statement => "SELECT * FROM videos_for_elastic_search"
  }
}

filter {
  mutate {
    convert => {
      # convert some fields to booleans
    }
  }

  # a bunch of mutations to add tags to each document
}

output {
  elasticsearch {
    hosts => ["REDACTED", "REDACTED"]
    index => "REDACTED"
    id => "REDACTED"
    document_id => "%{REDACTED}-%{REDACTED}-%{REDACTED}"
  }

  if "ERROR" in [message] or "WARN" in [message] {
    slack {
      # ... slack integration
    }

    email {
      # ... email integration
    }
  }
}
Any ideas? I'm very new to the ELK stack so maybe I'm missing something obvious.

You might be able to do it by enabling the dead letter queue (DLQ) on the elasticsearch output and then running a second pipeline that reads from the queue and sends the alerts. Indexing failures like mapping errors never show up in the event's [message] field, which is why your conditional never matches; the DLQ is where those failed events end up.
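A rough sketch of that approach, assuming a Logstash version with DLQ support and the dead_letter_queue input plugin installed. The path and the metadata field shown are typical but should be checked against your installation; the slack block is left as a placeholder like in your original config.

# logstash.yml -- enable the DLQ for this Logstash instance
dead_letter_queue.enable: true
path.dead_letter_queue: "/var/lib/logstash/dlq"   # assumed path, adjust to your install

# dlq-alerts.conf -- a second pipeline that reads failed events and alerts on them
input {
  dead_letter_queue {
    path => "/var/lib/logstash/dlq"
    commit_offsets => true   # remember what has already been read across restarts
  }
}

output {
  slack {
    # the failure reason is attached to the event's metadata, e.g.
    # %{[@metadata][dead_letter_queue][reason]}
    # ... slack integration
  }
}

With this in place, any event the elasticsearch output fails to index (for example, because a value is out of bounds for the mapped data type) is written to the DLQ, and the second pipeline turns it into a notification instead of it just being a warning in the Logstash log.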

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.