Hi,
Elastic.co claims that ML reduces false positives but doesn't explain how.
To explain my question let's consider this scenario:
My aim is to monitor -using ML and ELK stack- a Java application installed on a Linux Host.
Let's say that suddenly there is a lot of traffic generated by/within the app (e.g. many visitors connecting to the GUI, JMS message rates going up, ...); this means, for example, that RAM usage (it could be the JVM heap instead, but let's keep RAM) will grow significantly!
Is there any ML job applied to the "RAM used" metric such that, if RAM usage grows while the generated traffic also grows, ML considers the situation normal and doesn't fire a notification or flag it as an anomaly?
Another general question: can we (via the API, for example) tell ML that a generated anomaly is a false positive and so delete it?
With regards,
ML helps reduce false positives compared to other techniques (e.g. static threshold alerts or simplified statistics like standard deviations) because of the very nature of the approach, which is to only "alert" when the behavior of something is statistically significantly different from what it usually is. I put "alert" in quotes because ML isn't doing the alerting itself; the integration with X-Pack Alerting ("Watcher") is the mechanism that actually alerts.
Your second question, about doing an AND on two analyses (RAM usage and traffic), is currently best solved via a chained-input Watch: first query the results of the 1st ML job, then use the contextual information (e.g. the hostname the anomaly is for) to subsequently query for anomalies in the 2nd ML job for that entity, and only alert if both conditions match.
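As a rough illustration, a chained-input Watch along those lines could look like the sketch below. The job IDs (`ram-usage-job`, `traffic-job`), the record-score threshold, and the time window are all assumptions you'd adapt to your own jobs; the second input narrows to the same entity by referencing the first payload via a mustache template:

```json
{
  "trigger": { "schedule": { "interval": "5m" } },
  "input": {
    "chain": {
      "inputs": [
        {
          "ram_job": {
            "search": {
              "request": {
                "indices": [ ".ml-anomalies-*" ],
                "body": {
                  "query": {
                    "bool": {
                      "filter": [
                        { "term": { "job_id": "ram-usage-job" } },
                        { "term": { "result_type": "record" } },
                        { "range": { "record_score": { "gte": 75 } } },
                        { "range": { "timestamp": { "gte": "now-10m" } } }
                      ]
                    }
                  }
                }
              }
            }
          }
        },
        {
          "traffic_job": {
            "search": {
              "request": {
                "indices": [ ".ml-anomalies-*" ],
                "body": {
                  "query": {
                    "bool": {
                      "filter": [
                        { "term": { "job_id": "traffic-job" } },
                        { "term": { "result_type": "record" } },
                        { "range": { "record_score": { "gte": 75 } } },
                        { "range": { "timestamp": { "gte": "now-10m" } } },
                        { "term": { "partition_field_value": "{{ctx.payload.ram_job.hits.hits.0._source.partition_field_value}}" } }
                      ]
                    }
                  }
                }
              }
            }
          }
        }
      ]
    }
  },
  "condition": {
    "script": {
      "source": "ctx.payload.ram_job.hits.total > 0 && ctx.payload.traffic_job.hits.total > 0"
    }
  },
  "actions": {
    "log_anomaly": {
      "logging": { "text": "Anomalies found in both the RAM and traffic jobs" }
    }
  }
}
```

The condition fires only when both jobs report an anomalous record in the window; to filter by host you'd partition both jobs on the hostname field so `partition_field_value` lines up. (On Elasticsearch 7+, `hits.total` is an object, so the condition would compare `hits.total.value` instead.)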
There is no current facility to tag an individual generated anomaly as a false positive. What's the concern here?
Thank you for your response.
For my last question, I thought it would be a good idea to be able to mark a generated anomaly as a false positive in the case of a known but unusual activity that we forgot to declare in "Calendars and scheduled events".
Ok - thanks for clarifying. We'll take that suggestion into consideration.