I'm using Elastic Stack 6.2.4.
I've built a dashboard to support the analysis of our load tests. For the selected interval I am plotting the first and last event timestamps, the count of events, and the throughput. But I don't understand the results Kibana is giving me; I get very different results when I do the calculation in Excel.
I defined the first part of the aggregation as follows:
Please click on the images to see the full config; the forum is cutting off the bottom of the images in the overview!
The result looks like the following:
I've selected the following range in Kibana: 2018-11-14 16:08:00.000 to 2018-11-14 16:40:37.312
I know I initially made a thinking error: I hoped the throughput would be calculated based on the min and max timestamps, but on second thought I am quite sure that Kibana uses the time window set in the dashboard.
Let's discuss the first line:
My time window (selected in Kibana) is 32 min and 37 sec. That makes 1957 seconds. In this time we find 181053 events. But Kibana tells me that I have a throughput of 181053 per hour, or 109 events per second. If I calculate the average throughput per second, I would calculate 181053 / 1957 s = 92.52 events per second. That difference from 109 events per second is way too large to be a rounding issue.
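For reference, this is the calculation I expect Kibana to do, just restating the arithmetic above in plain Python (the variable names are mine, not anything from Kibana):

```python
from datetime import datetime

# The time range selected in Kibana (as given above)
start = datetime(2018, 11, 14, 16, 8, 0, 0)
end = datetime(2018, 11, 14, 16, 40, 37, 312000)
event_count = 181053  # count Kibana reports for this window

window_seconds = (end - start).total_seconds()
per_second = event_count / window_seconds

print(f"window:     {window_seconds:.0f} s")     # ~1957 s
print(f"throughput: {per_second:.2f} events/s")  # ~92.5, not the 109 Kibana shows
```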
What is Kibana calculating, and how can I get Kibana to calculate it my way?
Btw: if my load test runs across an hour boundary, Kibana divides the average count per hour by 2, i.e. the number of hours involved.
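If that observation is right, the per-hour number seems to be normalized by the number of calendar hours the window touches rather than by the elapsed time. A minimal sketch of the two interpretations (function names are mine, this is a guess at the behavior, not Kibana's actual code):

```python
def per_hour_by_calendar_buckets(count, hours_touched):
    # What the dashboard appears to do: divide by the number of
    # distinct calendar hours the selected window touches.
    return count / hours_touched

def per_hour_by_elapsed_time(count, window_seconds):
    # What I would expect: divide by the actual elapsed time in hours.
    return count / (window_seconds / 3600.0)

# Example: a 20-minute window from 16:50 to 17:10 (1200 s) with 1200 events.
# It touches two calendar hours, so the first reading is halved.
print(per_hour_by_calendar_buckets(1200, 2))   # 600.0
print(per_hour_by_elapsed_time(1200, 1200))    # 3600.0
```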
Can anyone please shed some light?