Wildcard suffix for indices, aggregate for each index

Hi,

I have a watcher that checks for errors and sends an e-mail if any are found. It's set up to check a wildcard index pattern ("services-*"). There are too many services to manage one watcher per service, so a wildcard is a must.
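
As a side note, if you want to see which concrete indices the wildcard currently resolves to, the cat indices API can list them (the h= column selection is optional):

GET _cat/indices/services-*?v&h=index,docs.count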

Anyway, once the watcher has identified one or more errors, it sends an e-mail with the error count and an excerpt from a stack trace.

This has served us well, but I've started thinking about how to modify the watcher so that the e-mail body includes a list of each index and its error count, e.g.

18 errors found.
  * 9: service-name-1 (the underlying index pattern would work too)
  * 6: service-name-2
  * 3: service-name-42

Is that doable?

And for reference, this is the current watcher:

{
  "trigger": {
    "schedule": {
      "interval": "1m"
    }
  },
  "throttle_period" : "1m",
  "input": {
    "search": {
      "request": {
        "search_type": "query_then_fetch",
        "indices": [
          "services-*"
          //more wildcards here
        ],
        "rest_total_hits_as_int": true,
        "body": {
          "query": {
            "bool": {
              "must": [
                {
                  "query_string": {
                    "query": "level:Error"
                  }
                },
                {
                  "range": {
                    "@timestamp": {
                      "gte": "now-1m"
                    }
                  }
                }
              ]
            }
          },
          "_source": [
            "message"
          ],
          "sort": [
            {
              "@timestamp": {
                "order": "desc"
              }
            }
          ]
        }
      }
    }
  },
  "condition": {
    "compare": {
      "ctx.payload.hits.total": {
        "gt": 0
      }
    }
  },
  "actions": {
    "send_email": {
      "email": {
        "profile": "standard",
        "to": [
          "account@hostname"
        ],
        "subject": "Log watcher [{{ctx.metadata.name}}]",
        "body": {
          "text": """{{ctx.payload.hits.total}} errors found.

{{ctx.payload}}
"""
        }
      }
    }
  }
}

I managed to fix it myself using a terms aggregation on the _index metadata field, then looping over the resulting buckets with a Mustache section in the e-mail body. Pretty neat. 🙂

{
  "trigger": {
    "schedule": {
      "interval": "1m"
    }
  },
  "throttle_period" : "1m",
  "input": {
    "search": {
      "request": {
        "search_type": "query_then_fetch",
        "indices": [
          "services-*"
          //more wildcards here
        ],
        "rest_total_hits_as_int": true,
        "body": {
          "query": {
            "bool": {
              "must": [
                {
                  "query_string": {
                    "query": "level:Error"
                  }
                },
                {
                  "range": {
                    "@timestamp": {
                      "gte": "now-1m"
                    }
                  }
                }
              ]
            }
          },
          "aggs": {
            "by_index": {
              "terms": {
                "field": "_index",
                "size": "100"
              }
            }
          },
          "_source": [
            "message"
          ],
          "sort": [
            {
              "@timestamp": {
                "order": "desc"
              }
            }
          ]
        }
      }
    }
  },
  "condition": {
    "compare": {
      "ctx.payload.hits.total": {
        "gt": 0
      }
    }
  },
  "actions": {
    "send_email": {
      "email": {
        "profile": "standard",
        "to": [
          "account@hostname"
        ],
        "subject": "Log watcher [{{ctx.metadata.name}}]",
        "body": {
          "text": """{{ctx.payload.hits.total}} errors found.

{{#ctx.payload.aggregations.by_index.buckets}}
  {{doc_count}} error(s) in {{key}}
{{/ctx.payload.aggregations.by_index.buckets}}
"""
        }
      }
    }
  }
}
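
Two closing notes in case someone else lands here: the terms aggregation orders buckets by doc_count descending by default, so the list comes out sorted largest-first like the example above, and the size of 100 caps how many indices can show up in the e-mail. Also, to inspect the buckets and the rendered e-mail without waiting for the schedule, the Execute Watch API can simulate the action (the watch id log-error-watcher below is just a placeholder for your own):

POST _watcher/watch/log-error-watcher/_execute
{
  // run the watch even if the condition would currently be false
  "ignore_condition": true,
  // render the e-mail action without actually sending anything
  "action_modes": {
    "_all": "simulate"
  }
}

The response contains the full execution record, including the search payload and the simulated e-mail body.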
