Machine Learning - Watcher - Not Firing

So I've been able to get some watchers to fire at a more consistent rate, so that's a step in the right direction. What I'm learning, however, is that if you use one or just a few boxes to turn Metricbeat on and off, the ML job picks up on that, and those servers don't produce critical errors that are easy to sort by in the alerting dashboard. This seems a little odd with the ML metrics: the server isn't reporting at all, which should be a critical error, yet ML only produces the lowest severity level, "warning". I do get how testing this multiple times affects ML, but it still seems odd to me that critical anomalies aren't produced when Metricbeat stops reporting. I'll try to post what I have once I fine-tune it all.
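In case it helps anyone following along, the severity label comes from the anomaly score (roughly: warning below 25, up through critical at 75+), so a watch that queries the ML results index and filters on `record_score` can skip low-severity results entirely rather than relying on the severity label in the dashboard. Here's a rough sketch of the kind of watch body I mean; the 75 threshold, the schedule/time window, and the logging action are all placeholders to adjust, not a working config for this exact setup:

```json
{
  "trigger": { "schedule": { "interval": "5m" } },
  "input": {
    "search": {
      "request": {
        "indices": [ ".ml-anomalies-*" ],
        "body": {
          "query": {
            "bool": {
              "filter": [
                { "term":  { "result_type": "record" } },
                { "range": { "record_score": { "gte": 75 } } },
                { "range": { "timestamp": { "gte": "now-10m" } } }
              ]
            }
          }
        }
      }
    }
  },
  "condition": {
    "compare": { "ctx.payload.hits.total": { "gt": 0 } }
  },
  "actions": {
    "log_critical": {
      "logging": { "text": "High-severity ML anomalies: {{ctx.payload.hits.total}}" }
    }
  }
}
```

That still wouldn't fix the underlying issue here, which is that the score itself stays low when Metricbeat simply stops reporting.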