I'm trying to cheaply get some numeric data out of our logs. There's an event that we log every time it happens, and I'd like to see the rate at which it happens. I've added a Lens visualization with a filter for that log message and a formula:
counter_rate(count())
counter_rate() is not designed for this, but it seems to work more or less. However, the chart still shows a cumulative value for every bucket, i.e. the bigger the bucket, the greater the value:
Is there still a way to get what I want without employing full-blown metrics, i.e. to calculate the rate of log entries matching a certain filter?
Just to check that I understand this: you want the number of times a specific log event happens over time? For that you could just use count() instead of counter_rate().
Or are you dependent on some numerical value from each event?
Hi @andreycha
As @Marius_Iversen said, if you just want a rate of events, it is just count(), but under Advanced you can set Normalize by unit to events per minute or events per second,
which is effectively the rate.
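For reference, the same normalization can also be written directly in the formula editor via `normalize_by_unit` (assuming it is available in your Lens version; check the function list in the formula editor):

```
normalize_by_unit(count(), unit='1m')
```

This divides each bucket's count by the bucket length expressed in minutes, so the chart shows events per minute regardless of bucket size.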
One more question though: how can I see the max rate within every bucket? As far as I understand, normalize just calculates the rate from the count of events and the bucket size. So the problem remains that I only see a single averaged value per bucket:
Thanks for the answer, Marius. Unfortunately, I only have these four functions available. As far as I understand, all the other functions, including max(), require a field name as an argument.
So the formula editor also gives an error on max(count()). (I'm using Kibana 7.17, if that's important.)
What I rather need is something like max(rate), where the rate is calculated per second instead of over the current bucket size.
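One way to get a per-second max without Lens is to query Elasticsearch directly: a coarse date_histogram for the display buckets, a per-second date_histogram inside it, and a max_bucket pipeline aggregation that picks the busiest second in each display bucket. A sketch, assuming a log index with an `@timestamp` field and a `message` match for the event (the field names, the "1h" display interval, and the filter text are placeholders to adapt):

```json
{
  "size": 0,
  "query": {
    "match_phrase": { "message": "my event text" }
  },
  "aggs": {
    "per_display_bucket": {
      "date_histogram": { "field": "@timestamp", "fixed_interval": "1h" },
      "aggs": {
        "per_second": {
          "date_histogram": { "field": "@timestamp", "fixed_interval": "1s" }
        },
        "max_rate_per_second": {
          "max_bucket": { "buckets_path": "per_second>_count" }
        }
      }
    }
  }
}
```

Each `max_rate_per_second` value is then the peak events-per-second inside that hour. Note the inner per-second histogram can produce many buckets over long time ranges, so keep the queried window reasonably small.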