I have a metric threshold alert that triggers when the document count is above 30. The alert itself seems to trigger just fine. For the action body I'm setting this:
And this is the data that I get back when the alert fires:
{"alertId":"SOMEMADEUPID","alertName":"Aborted Alert","spaceId":"default","tags":["Dev"],"alertInstanceId":"US0418,TPASDEMO","alertActionGroup":"metrics.threshold.fired","alertActionGroupName":"Alert","context":{"group":"US0418,TPASDEMO","alertState":"ALERT","reason":"Document count is 62,474 in the last 2 hrs for US0418,TPASDEMO. Alert when > 30.","viewInAppUrl":"SOMEURL","timestamp":"2023-11-01T19:29:19.858Z","value":{"condition0":"62,474"},"threshold":{"condition0":["30"]},"metric":{}},"date":"2023-11-01T19:29:23.552Z","state":{"start":"2023-10-17T17:54:53.840Z","duration":"1301364441000000"},"kibanaBaseUrl":"SOMEBASEURL","params":{"criteria":[{"comparator":">","timeSize":2,"aggType":"count","threshold":[30],"timeUnit":"h"}],"sourceId":"default","alertOnNoData":true,"alertOnGroupDisappear":true,"groupBy":["labels.storeName","labels.retailer"],"filterQueryText":"labels.http_route: "/pos/order/{orderId}/{version}/void" and url.path : *"},"rule":{"id":"SOMEID","name":"Aborted Alert","type":"metrics.alert.threshold","spaceId":"default","tags":["Dev"]},"alert":{"id":"US0418,TPASDEMO","actionGroup":"metrics.threshold.fired","actionGroupName":"Alert"}}
The document count of 62,474 is an unusually large number and looks incorrect to me. I haven't quite understood why the rule is reporting that value or where it's coming from. Any insight would be appreciated.
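If it helps with reproducing the number, below is a rough sketch of the count query I'd expect to match what the rule is evaluating, based on the params above (2-hour window, count aggregation, the KQL filter, and the two groupBy fields). It's an assumption only: the index pattern metrics-* is a placeholder for whatever the Metrics source points at, I'm assuming the group key "US0418,TPASDEMO" maps to labels.storeName = US0418 and labels.retailer = TPASDEMO in groupBy order, and I'm guessing labels.http_route is a keyword-ish field matched with match_phrase.

// Kibana Dev Tools console sketch; index pattern and field assumptions noted above
GET metrics-*/_count
{
  "query": {
    "bool": {
      "filter": [
        { "term": { "labels.storeName": "US0418" } },
        { "term": { "labels.retailer": "TPASDEMO" } },
        { "match_phrase": { "labels.http_route": "/pos/order/{orderId}/{version}/void" } },
        { "exists": { "field": "url.path" } },
        { "range": { "@timestamp": { "gte": "now-2h", "lte": "now" } } }
      ]
    }
  }
}

If a query like this also returns a count in the tens of thousands, then the rule's number would at least be consistent with the underlying data for that group over the 2-hour window.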