Create one alert monitor for missing logs on multiple indices

v7.8.0

I have a monitor set up on one index that basically checks whether the count of a particular log message is 0 for the past 10 minutes. I pick one index and set up the query like below, with the trigger condition set to (ctx.results[0].hits.total.value == 0). If no "Successful transaction." logs are found, I get alerted in a Slack channel.
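For reference, in the Open Distro alerting plugin that trigger condition lives in the trigger's Painless script. A minimal sketch of the trigger definition (the trigger name and severity here are placeholders, not from the original post):

```json
{
    "name": "no-successful-transactions",
    "severity": "1",
    "condition": {
        "script": {
            "source": "ctx.results[0].hits.total.value == 0",
            "lang": "painless"
        }
    }
}
```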

{
    "size": 0,
    "query": {
        "bool": {
            "must": [
                {
                    "match_all": {
                        "boost": 1
                    }
                },
                {
                    "match_phrase": {
                        "message": {
                            "query": "Successful transaction.",
                            "slop": 0,
                            "zero_terms_query": "NONE",
                            "boost": 1
                        }
                    }
                },
                {
                    "range": {
                        "@timestamp": {
                            "from": "now-10m",
                            "to": "now",
                            "include_lower": true,
                            "include_upper": true,
                            "boost": 1
                        }
                    }
                }
            ],
            "adjust_pure_negative": true,
            "boost": 1
        }
    }
}

Now my problem is that I want to create the same alert for other indices, but if I do that and there is an issue with logs coming into Kibana, I'll get spammed by messages from multiple monitors. Is there a way to monitor multiple indices with just one monitor and get alerted with something like "This index has no logs..."?
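One approach worth sketching (assuming the Open Distro alerting plugin on ES 7.8): point a single monitor at an index pattern covering all the indices, and add a terms aggregation on the built-in `_index` metadata field so the result is broken down per index. Note the caveat: an index with zero matching documents simply produces no bucket, so the trigger has to compare the number of buckets returned against the number of indices you expect, rather than look for a bucket with doc_count 0. The index pattern and expected count below are placeholders:

```json
{
    "size": 0,
    "query": {
        "bool": {
            "must": [
                {
                    "match_phrase": {
                        "message": "Successful transaction."
                    }
                },
                {
                    "range": {
                        "@timestamp": {
                            "from": "now-10m",
                            "to": "now"
                        }
                    }
                }
            ]
        }
    },
    "aggregations": {
        "per_index": {
            "terms": {
                "field": "_index"
            }
        }
    }
}
```

With a trigger condition along the lines of `ctx.results[0].aggregations.per_index.buckets.size() < 3` (for three monitored indices), the alert fires when any index is missing from the buckets, and the action's message template can iterate the buckets that did report to help identify which index went quiet.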

maybe model your Watch after this example:
