I have a few Elastic Agents collecting metrics, but we have a problem when one of them goes down. I know there's an alerting option for metrics/logs thresholds, but I'm not sure how to set up an alert for when data goes missing.
It would be great to have some ideas on this topic.
Would a new terms query in the detection rules be an option?
You could query for status: offline and use a new terms aggregation on host.name.
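A rough sketch of what that detection rule could look like. The index pattern and field names here are assumptions — check your own mappings for where agent status actually lives:

```
Rule type:        New Terms
Index pattern:    (your agent status indices)
Custom query:     status : "offline"
New terms field:  host.name
History window:   7d
```

The history window controls how far back a host.name value must be absent before it counts as "new", so a host that flips to offline for the first time in that window would trigger the rule.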
If you collect system metrics, you can just create a metric threshold rule; there's an option to be alerted when it stops sending data.
But it seems like there should be an easier way.
I also saw this
@DougR, thanks for posting. Unfortunately, I don't think there is such functionality in Kibana. I found this feature request by @Jeff_Vestal
You can create rules based on what your agent writes (for example, the System integration writing to the logs-* data stream with data_stream.dataset: elastic_agent), but I agree that is not ideal.
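The "alert when a host stops writing" idea behind that workaround can be sketched in a few lines. This is a hypothetical Python helper, not anything built into Kibana: it takes the most recent document timestamp seen per host (however you fetch that from logs-*) and flags hosts that have gone quiet longer than a threshold.

```python
from datetime import datetime, timedelta, timezone


def offline_hosts(latest_seen: dict[str, datetime],
                  threshold_minutes: int = 5) -> list[str]:
    """Return hosts whose most recent document is older than the threshold.

    latest_seen maps host.name -> timestamp of the newest document that
    host has written (e.g. from a terms aggregation with a max(@timestamp)
    sub-aggregation). Any host whose latest document predates the cutoff
    is considered to have stopped reporting.
    """
    cutoff = datetime.now(timezone.utc) - timedelta(minutes=threshold_minutes)
    return sorted(host for host, ts in latest_seen.items() if ts < cutoff)
```

This is essentially what the missing-data option on a metric threshold rule does for you; rolling your own only makes sense if you need custom grouping or notification logic.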
Example of a single agent writing data to the logs-* data view, filtered to show activity from the agent itself, Filebeat, and Metricbeat.