I'm trying to create a metric for events per second, but not really sure how to do it. I saw one example that seems to get close, but when I manually do the math, it's a little off.
The visualization configuration below is on 10,426 events over a 1 minute interval. Doing the math, that comes out to 173.7666 per second. The visualization gives me 170.918. What am I doing wrong?
You have a few options here, but first let me explain what you're probably seeing.
Notice that the setting is labeled the "minimum interval": Kibana treats it as a lower bound because it tries to prevent you from running slow queries. Kibana will rewrite your interval if it would produce more buckets than histogram:maxBars, which defaults to 200. Imagine a 15 minute time period with per-second buckets: that would produce 15 * 60 = 900 buckets, well over the limit, so Kibana chooses a wider interval for you. This is likely the source of your discrepancy: 10,426 / 170.918 ≈ 61, so the rate was probably computed over a bucket spanning about 61 seconds rather than exactly 60.
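The rewriting behavior can be sketched roughly like this (this is an illustration of the idea, not Kibana's actual code, and the `effective_interval` helper is hypothetical):

```python
import math

# Sketch of how a date histogram interval gets scaled up when the
# requested interval would exceed a max-bars limit.
def effective_interval(time_range_s: float, requested_s: float, max_bars: int = 200) -> float:
    """Return an interval at least as large as requested_s that keeps
    the bucket count at or below max_bars."""
    buckets = time_range_s / requested_s
    if buckets <= max_bars:
        return requested_s
    # Widen the interval until the bucket count fits under the limit.
    return requested_s * math.ceil(buckets / max_bars)

# 15 minutes at a 1-second interval would be 900 buckets, over the
# 200-bucket default, so the interval is widened to 5 seconds:
print(effective_interval(15 * 60, 1))  # 5.0  (180 buckets)

# A 1-minute range at 1-second buckets fits, so it is left alone:
print(effective_interval(60, 1))  # 1.0
```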
So one option you have is to increase the advanced setting histogram:maxBars. This is a per-space Kibana setting, and it will affect all date histograms.
Another option you have is to use TSVB. TSVB also has a bucket limit, but it's 2000 by default. We've written a blog post about rates in TSVB.
Another option you have is to use Vega, which lets you build Elasticsearch queries using the full DSL. It's a more advanced tool and doesn't apply the same guardrails, so it will happily run slow queries.
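To make that concrete, here is the shape of query body a Vega spec could send: a plain date_histogram with a 1-second fixed_interval, which Kibana won't rewrite because Vega passes the DSL through as-is. This is a hypothetical sketch (shown as a Python dict for readability) and assumes your documents carry a @timestamp field:

```python
import json

# Hypothetical Elasticsearch query body for per-second event counts,
# the kind of request a Vega spec can send directly.
# Assumes documents have a "@timestamp" date field.
query = {
    "size": 0,  # we only want the aggregation, not the hits
    "aggs": {
        "per_second": {
            "date_histogram": {
                "field": "@timestamp",
                "fixed_interval": "1s",  # sent as-is; no max-bars rewriting
            }
        }
    },
}

print(json.dumps(query, indent=2))
```

Each returned bucket's doc_count is then already an events-per-second value, since every bucket spans exactly one second.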