Kibana Alerting Missing Issue

Hi,

I'm monitoring logs through Kibana alerting, but there is a problem.

Sometimes no alert comes from Kibana even though a matching log exists. When I checked whether the log was actually missing, it was there.

The alert is scheduled to run once a minute, and the trigger condition is ctx.results[0].hits.total.value > 0.

The alert query is configured as follows.

{
    "size": 1,
    "query": {
        "bool": {
            "must": [
                {
                    "match": {
                        "message": {
                            "query": "Success",
                            "operator": "AND",
                            "prefix_length": 0,
                            "max_expansions": 50,
                            "fuzzy_transpositions": true,
                            "lenient": false,
                            "zero_terms_query": "NONE",
                            "auto_generate_synonyms_phrase_query": true,
                            "boost": 1
                        }
                    }
                },
                {
                    "match": {
                        "message": {
                            "query": "Version",
                            "operator": "AND",
                            "prefix_length": 0,
                            "max_expansions": 50,
                            "fuzzy_transpositions": true,
                            "lenient": false,
                            "zero_terms_query": "NONE",
                            "auto_generate_synonyms_phrase_query": true,
                            "boost": 1
                        }
                    }
                },
                {
                    "match": {
                        "beat.hostname": {
                            "query": "nms",
                            "operator": "OR",
                            "prefix_length": 0,
                            "max_expansions": 50,
                            "fuzzy_transpositions": true,
                            "lenient": false,
                            "zero_terms_query": "NONE",
                            "auto_generate_synonyms_phrase_query": true,
                            "boost": 1
                        }
                    }
                },
                {
                    "match": {
                        "source": {
                            "query": "nms-oplog",
                            "operator": "AND",
                            "prefix_length": 0,
                            "max_expansions": 50,
                            "fuzzy_transpositions": true,
                            "lenient": false,
                            "zero_terms_query": "NONE",
                            "auto_generate_synonyms_phrase_query": true,
                            "boost": 1
                        }
                    }
                },
                {
                    "range": {
                        "@timestamp": {
                            "from": "{{period_end}}||-2m",
                            "to": "{{period_end}}",
                            "include_lower": true,
                            "include_upper": true,
                            "boost": 1
                        }
                    }
                }
            ],
            "must_not": [
                {
                    "match": {
                        "connection_node": {
                            "query": "x-osp-*",
                            "operator": "OR",
                            "prefix_length": 0,
                            "max_expansions": 50,
                            "fuzzy_transpositions": true,
                            "lenient": false,
                            "zero_terms_query": "NONE",
                            "auto_generate_synonyms_phrase_query": true,
                            "boost": 1
                        }
                    }
                }
            ],
            "adjust_pure_negative": true,
            "boost": 1
        }
    },
    "sort": [
        {
            "@timestamp": {
                "order": "desc"
            }
        }
    ]
}

The picture below shows that a matching log really existed.

I changed the range from {{period_end}}||-2m to {{period_end}}||-5m just in case, but it didn't help.
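To illustrate the window arithmetic behind that range clause, here is a small sketch (my own illustration, not the actual Kibana scheduler): a monitor running every minute with a 2-minute lookback produces overlapping windows, so a document timestamped within the schedule is always covered, while a document whose @timestamp carries a time-zone offset error falls outside every window.

```python
from datetime import datetime, timedelta

def query_windows(start, runs, interval_min=1, lookback_min=2):
    """Yield (from, to) windows for each monitor run.

    Mirrors the alert's range clause: from = period_end - lookback,
    to = period_end, with the monitor running every interval_min
    minutes. A sketch of the schedule described in this thread only.
    """
    for i in range(runs):
        period_end = start + timedelta(minutes=i * interval_min)
        yield (period_end - timedelta(minutes=lookback_min), period_end)

def covered(event_time, windows):
    """True if the event's @timestamp falls inside any query window."""
    return any(lo <= event_time <= hi for lo, hi in windows)

start = datetime(2020, 1, 1, 12, 0)
windows = list(query_windows(start, runs=5))

# A document timestamped inside the schedule is always covered...
print(covered(datetime(2020, 1, 1, 12, 1, 30), windows))  # True

# ...but a document whose @timestamp is shifted by a time-zone error
# (e.g. written 9 hours in the past) never matches any window.
print(covered(datetime(2020, 1, 1, 3, 1, 30), windows))   # False
```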

I can't switch log monitoring to Kibana because this problem keeps happening.

Please give me some ideas to fix this problem.


Several questions about this:

  1. The picture you show is doing more than just the query you ran. What happens when you filter using only this query?
  2. Your query looks too complicated, and not in a good way. Why are you using match queries instead of match_phrase? It looks like you want exact matches only: https://www.elastic.co/guide/en/elasticsearch/reference/current/query-dsl-match-query-phrase.html
  • Are you trying to do something with AND/OR logic? It's not working, if that was your goal.
  3. Could the time zone setting be affecting the matches? You aren't setting one in your query.

Finally, I think we need to see the configuration of what period_end is set to.
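For reference, here is a minimal sketch of what the must clauses could look like rewritten with match_phrase (field names taken from the query above; whether phrase matching fits depends on how those fields are analyzed in your mapping):

```json
{
    "bool": {
        "must": [
            { "match_phrase": { "message": "Success" } },
            { "match_phrase": { "message": "Version" } },
            { "match_phrase": { "source": "nms-oplog" } }
        ]
    }
}
```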

Hi,

First of all, thank you for your help.

  1. Using only that query produces the same results as the picture above.
    The picture is a saved search, used to view the query results easily.

  2. I used match because the previous operator's alerts used match,
    and the AND/OR logic was kept for the same reason.
    I was already considering changing to match_phrase,
    so I am now running a test with match_phrase instead.

  3. I set the time zone in the picture only to view the query results quickly.

I didn't set period_end anywhere myself, and I'm not sure exactly what you're referring to.
Please let me know how I can check that setting.

As you suggested, I plan to test with match_phrase and a minimal configuration.
I hope this will fix the missed Kibana alerts.

Why don't you try using the Inspect feature of Discover to see the exact query DSL that's being sent? My guess is that your Kibana instance has a time zone set that differs from what you're sending in your alerting query.
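If the time zone does turn out to be the culprit, one option is to set it explicitly in the range clause so the alert and Kibana agree. The "+09:00" offset below is only a placeholder assumption, not something established in this thread; time_zone applies when the dates being compared carry no explicit offset:

```json
{
    "range": {
        "@timestamp": {
            "from": "{{period_end}}||-2m",
            "to": "{{period_end}}",
            "time_zone": "+09:00"
        }
    }
}
```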

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.