Hi,
I have a question about the scoring system in ELK anomaly detection.
I've been running a job with a 'high_sum' detector, and it reported an anomaly with a record score of 99 even though the actual value (the sum of the field) was 0.
The field the detector refers to is 'bytes', so 0 is the smallest possible value.
That's clearly not a high_sum anomaly, yet this (buckets with an actual value of 0 being flagged as high anomalies) happens from time to time.
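For context, the detector is configured roughly like the sketch below (the job ID, bucket_span, and time_field are placeholders, not my exact settings; only the 'high_sum' function and the 'bytes' field are from the job in question):

```
PUT _ml/anomaly_detectors/bytes_high_sum
{
  "analysis_config": {
    "bucket_span": "15m",
    "detectors": [
      {
        "function": "high_sum",
        "field_name": "bytes"
      }
    ]
  },
  "data_description": {
    "time_field": "@timestamp"
  }
}
```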
If possible, I recommend you upgrade to the latest version (v8.2), as there have been several bug fixes in this area since 7.16. There were cases where the model could develop stability issues (a big spike followed by several spans of empty buckets, for example).
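In the meantime, you can pull the record in question to compare its actual and typical values; something like the query below (sorting records by score) should surface it:

```
GET _ml/anomaly_detectors/<job_id>/results/records
{
  "sort": "record_score",
  "desc": true
}
```

The returned records include `actual` and `typical` fields, which make it easy to confirm whether the actual sum really was 0 for that bucket.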