Measure for uncertainty of the Anomaly Detection algorithm


We are trying to find anomalies in people's daily power usage using Elastic's Anomaly Detection.

Of course, to detect a day-night cycle, the algorithm first needs enough data before it can recognize the pattern.
However, the time it takes before such a cycle is picked up can differ a lot.
Sometimes the model needs only 2 days' worth of data, and sometimes it never picks up the cycle at all. See the images below.

In the second image, around the middle, there is a clear point at which the model "understands" the data. Is there a way to detect this change in behaviour automatically?

I was thinking of analysing the rate of change in the model bounds, but I have not found a way to retrieve them when there are no anomalies. I also noticed that the forecast functionality does something similar when it checks whether enough data is available. Is it possible to run such a check without actually forecasting?
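One possible workaround: if model plot can be enabled on the job (via `model_plot_config`), the per-bucket `model_lower` and `model_upper` values are written to the ML results index for every bucket, even when no anomalies are recorded, so the bounds would be retrievable that way. A rough sketch of a "has the model settled?" heuristic on top of those values follows; the threshold and patience parameters are made up for illustration, not anything Elastic provides:

```python
# Hypothetical heuristic: given the width of the model bounds
# (model_upper - model_lower) per bucket, find the first bucket after
# which the width stops changing much, i.e. the model has "understood"
# the cycle. Threshold/patience values are arbitrary illustrations.

def stabilisation_bucket(widths, rel_threshold=0.05, patience=3):
    """Return the index of the first bucket starting a run of `patience`
    consecutive buckets whose relative width change stays below
    rel_threshold, or None if the bounds never settle."""
    streak = 0
    for i in range(1, len(widths)):
        prev, cur = widths[i - 1], widths[i]
        if prev > 0 and abs(cur - prev) / prev < rel_threshold:
            streak += 1
            if streak >= patience:
                return i - patience + 1
        else:
            streak = 0
    return None

# Example: bounds narrow sharply while the model learns, then settle.
widths = [100, 80, 55, 30, 29, 28.5, 28.2, 28.1]
print(stabilisation_bucket(widths))  # index where the bounds stabilise
```

The same idea could be run periodically against the job's results (filtering on `result_type: model_plot`) to flag the moment the day-night cycle has been learned.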

Thanks in advance,

Jorrit van der Laan

There is an upcoming feature in which significant model changes will be published to ML's Annotations.

Looks interesting! Do you know when this upcoming feature will be released?

We cannot commit to an exact date, but possibly v7.9.
