How do gauges and ranges work?


The gauge and range options make no sense to me, and there is nothing in the docs that explains how this is supposed to work.

Let's say my metric is the Sum of data usage, so I get something like 30MB. With the default range settings the whole gauge turns red, which makes no sense given that the ranges are divided as 0 - 50, 50 - 75, 75 - 100. Common sense suggests the result should fall within 0 - 50.

The gauge and ranges don't appear to take the field's format into account at all.

How is this supposed to work?

The provided ranges are just a starting point; you have to adjust them to make sense for your data set. I think part of the confusion comes from the fact that your gauge is displaying formatted values like '30MB'. The raw value is really 30,000,000 (more or less). You need to set your range boundaries using the raw values, not the formatted ones.
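To see why '30MB' and 30,000,000 are the same number, here is a small sketch (the helper name is made up for illustration and is not part of Kibana; it assumes decimal SI units, which is roughly what the example above implies):

```python
# Hypothetical helper (not Kibana code): convert a formatted display
# value such as "30MB" back to the raw number that range comparisons
# actually operate on.
def formatted_to_raw(value: str) -> float:
    """Parse strings like '30MB' or '1.5GB' into raw byte counts (SI units)."""
    units = {"B": 1, "KB": 10**3, "MB": 10**6, "GB": 10**9}
    # Check longer suffixes first so "MB" is not mistaken for "B".
    for suffix, factor in sorted(units.items(), key=lambda u: -len(u[0])):
        if value.upper().endswith(suffix):
            return float(value[: -len(suffix)]) * factor
    return float(value)

print(formatted_to_raw("30MB"))  # 30000000.0 -- the value the ranges see
```

So a gauge showing '30MB' is being compared against your range boundaries as 30,000,000, which is why it blows straight past a 0 - 50 range.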

Below is an example showing the use of ranges with numbers formatted as Bytes. Any Sum of bytes between 0 and 26000000 is green, and any Sum of bytes between 26000000 and 1000000000 is red.
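That bucketing logic can be sketched like this (illustrative only; the function name and the exact boundary handling are assumptions, not Kibana's implementation):

```python
# Illustrative sketch: ranges compare against the raw metric value.
# Thresholds taken from the example above: 0-26000000 green,
# 26000000-1000000000 red.
def gauge_color(raw_sum: float) -> str:
    """Return the color band for a raw Sum-of-bytes value."""
    if 0 <= raw_sum < 26000000:
        return "green"
    if 26000000 <= raw_sum <= 1000000000:
        return "red"
    return "out of range"

print(gauge_color(25000000))  # a 25MB sum lands in the green band
print(gauge_color(30000000))  # a 30MB sum lands in the red band
```

Note that a Sum displayed as '30MB' is a raw value of about 30,000,000, so with these thresholds it falls in the red band.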

Hi Nathan,

Thanks, that makes sense. I will try it out.

Shouldn't this sort of basic information be in the docs? It makes sense when you put it like that, but since you can format field values, it's not strange to assume those formatted values automatically apply to the ranges too.

Thanks for your feedback. I have created a pull request to update the documentation with this information.
