Unable to set up an alert for when no log data has been received from a host in the last hour

Hi, I am trying to set up an alert for when a host has stopped sending logs. I would like to group by host and check whether the log count is zero.

The UI mandates at least one condition. I tried setting host.name IS with an empty value, but it literally looks for "" as the host name, so the alert fires because no hosts match that name.

Is there a way to specify a wildcard?


Hi @elango1,

The IS operator indeed only looks for exact matches in keyword-type fields. I filed a feature request for a partial match operator a while ago, so it's in our backlog: [Logs UI] Add partial match operator to alerting conditions for keyword fields · Issue #74130 · elastic/kibana · GitHub.

Until then, wouldn't specifying WITH host.name IS NOT this_is_not_a_valid_hostname lead to the desired result? That condition should evaluate to true for every log entry, so the result would include all documents.
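
If it helps to see what that workaround boils down to, here is a minimal sketch (not the Logs UI rule itself) of the equivalent query: count log entries per host over the last hour while excluding a placeholder host name that no real host matches. The Elasticsearch URL, the logs-* index pattern, and the lack of authentication are assumptions for illustration only.

```python
# Sketch of the "always true" condition: bucket log entries by host over the
# last hour, excluding a placeholder hostname that no real host will match.
# ES_URL and INDEX are assumptions for illustration, not values from the thread.
import requests

ES_URL = "http://localhost:9200"  # assumed Elasticsearch endpoint
INDEX = "logs-*"                  # assumed log index pattern

query = {
    "size": 0,
    "query": {
        "bool": {
            "filter": [{"range": {"@timestamp": {"gte": "now-1h"}}}],
            "must_not": [{"term": {"host.name": "this_is_not_a_valid_hostname"}}],
        }
    },
    "aggs": {
        "per_host": {"terms": {"field": "host.name", "size": 1000}}
    },
}

resp = requests.post(f"{ES_URL}/{INDEX}/_search", json=query).json()
for bucket in resp["aggregations"]["per_host"]["buckets"]:
    # Every host that sent at least one log entry shows up as a bucket here.
    print(bucket["key"], bucket["doc_count"])
```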

There is one additional caveat when checking for a zero log count: if there are no documents from a host at all, there won't be a group for that host to alert on. To get around that, we collect the list of hosts by looking an additional hour (in your example) into the past. Any host that hasn't sent data for longer than that won't trigger the alert, because the system doesn't know it exists.
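
To make that look-back behaviour concrete, here is a hedged sketch of the logic (not the actual rule implementation): collect the hosts seen over the last two hours, then flag any of them that have no documents in the most recent hour. The endpoint, index pattern, and bucket size are again assumptions for illustration.

```python
# Sketch of the look-back logic: hosts known from the wider window minus hosts
# active in the recent window are the ones that stopped sending logs.
import requests

ES_URL = "http://localhost:9200"  # assumed Elasticsearch endpoint
INDEX = "logs-*"                  # assumed log index pattern

def hosts_seen(since: str) -> set:
    """Return the set of host.name values with at least one log entry since `since`."""
    query = {
        "size": 0,
        "query": {"range": {"@timestamp": {"gte": since}}},
        "aggs": {"per_host": {"terms": {"field": "host.name", "size": 1000}}},
    }
    resp = requests.post(f"{ES_URL}/{INDEX}/_search", json=query).json()
    return {b["key"] for b in resp["aggregations"]["per_host"]["buckets"]}

known_hosts = hosts_seen("now-2h")   # hosts the system still "knows about"
active_hosts = hosts_seen("now-1h")  # hosts that sent data in the last hour

silent_hosts = known_hosts - active_hosts
print("Hosts that stopped sending logs:", sorted(silent_hosts))
# A host that has been silent for longer than the wider window drops out of
# known_hosts entirely, so it can no longer trigger the alert -- the caveat above.
```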

Thanks Weltenworth.
This will suffice for my needs. The solution looked like a hack, and I just wanted to make sure there wasn't a feature I was not aware of.

Thanks for the catch on hosts not sending data.

Maybe the log rate metric would be ideal for building this alert, or I could leverage a machine learning job.