Exponential result in Kibana metric visualization when expecting zero

Hello,

I am creating some dashboards in Kibana 6.8 to compare data from two different environments, distinguished by the field "tecnologia". In the dashboards I use metric visualizations to count the number of documents in each environment and then calculate the difference.

In many visualizations, a result that should be 0 shows up as an exponential number instead. See the example:

The field is "SLDCON" and it has the following mapping:

"SLDCON": {
          "type": "double",
          "ignore_malformed": true
        }

And in the pipeline it is converted to float_eu:

mutate {
    convert => {
      "SLDCON" => "float_eu"
    }
  }

I created a scripted field "countGM":

if (doc.containsKey('tecnologia')) {
  if (doc['tecnologia'].value == "Mainframe") {
    return -1;
  } else {
    return 1;
  }
} else {
  return 0;
}

To calculate the difference between the two environments in a metric visualization:

It is set up like this for hundreds of visualizations, and it normally shows the correct value of 0, but randomly, on different days, it shows this exponential number instead.

Looking at the responses behind the metric visualizations for the two values being subtracted, I can see that one of them is really "607522597.6600001" and not the "607522597.66" the visualization shows:

{
  "took": 10,
  "timed_out": false,
  "_shards": {
    "total": 8,
    "successful": 8,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": 24742,
    "max_score": 0,
    "hits": []
  },
  "aggregations": {
    "1": {
      "value": 607522597.6600001
    }
  },
  "status": 200
}

And:

{
  "took": 4,
  "timed_out": false,
  "_shards": {
    "total": 8,
    "successful": 8,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": 24742,
    "max_score": 0,
    "hits": []
  },
  "aggregations": {
    "1": {
      "value": 607522597.66
    }
  },
  "status": 200
}

I think this trailing "0.0000001" is what causes the exponential number. I have seen in older posts that using something like this in the script:

"script" : "Math.round(doc['NUMBER_TO_ROUND'].value * 100) / 100.0"

solves the problem, but that would mean modifying hundreds of visualizations. Is there any other way to round the number without having to modify all the existing visualizations?
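For reference, the effect of that rounding idiom can be sketched in Python; `round2` here is a hypothetical helper mirroring the Painless expression, not anything from Kibana itself:

```python
def round2(x):
    """Mirror the Painless idiom Math.round(x * 100) / 100.0."""
    return round(x * 100) / 100.0

print(round2(607522597.6600001))  # 607522597.66
```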

Thank you in advance

@flash1293 / @Stratoula_Kalafateli help please? Thanks!

You might be able to define a field formatter for the SLDCON field in the index pattern and set it to the appropriate number of decimal places.
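For example, assuming Kibana 6.8's Number field formatter (Management → Index Patterns → edit the SLDCON field), a Numeral.js-style pattern along these lines would limit the display to two decimals; the exact pattern is up to you:

```
0,0.00
```

Since this is applied at display time in the index pattern, it affects every visualization that reads the field, without editing them one by one.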

Hi,

Looking deeper while testing, it turns out that none of the ingested documents has more than two decimals. It's as if the visualization were "making them up".

Therefore, a decimal formatter would not fix the issue.

It can still fix the issue: floating-point numbers are not 100% precise, and that is how these very small residues occur.
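A minimal Python sketch of that effect: IEEE-754 doubles cannot represent most decimal fractions exactly, so summing values that each have only a couple of decimals can still leave a tiny residue, which is likely how the sum aggregation produces 607522597.6600001 from two-decimal inputs:

```python
# Each input has at most one decimal digit, yet the sum carries an error.
values = [0.1] * 10

total = sum(values)
print(total == 1.0)            # False: a tiny floating-point residue remains
print(round(total, 2) == 1.0)  # True: rounding at display time removes it
```

This is why a decimal field formatter helps even though the documents themselves never contain extra decimals: the residue is introduced by the arithmetic, not by the data.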

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.