I have the same three values recurring every 10 seconds, as shown in Discover:
(Here was a picture, I couldn't post, because I'm a new user.)
The sum should add up to 1.642. This also shows in Lens in an area visualization:
(Here was a picture, I couldn't post, because I'm a new user.)
You can see the simple sum aggregation over one (redacted) field. Inspecting the data also shows the right value of 1.642 (no rounding or formatting is defined).
(Here was a picture, I couldn't post, because I'm a new user.)
But looking at the request response, I see a rounding error: 1.6420000000000001 instead of 1.642.
I'm unsure if I should post the other screenshots in replies one at a time, because I don't want to break any forum rules.
If someone can give me the appropriate rights, I can edit my initial post.
I'm sorry you had trouble uploading screenshots; I'm not familiar with the forum rules for new users.
Let me make sure I understand your question, though.
It sounds like the correct value of 1.642 is showing in Lens everywhere, but then you see the trailing 000...001 in the response in the inspector. Is that right?
In Discover, open up the document and look at the actual JSON and the Fields tab to see if they all show the exact values...
Discover sometimes applies some formatting niceties in the table...
I ran into something like this a while back; let me see if I can find it... that one actually turned out to be a JSON issue: floating-point representation in JSON...
It may not be the same issue, but it was similar.
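To show what I mean by floating-point representation in JSON, here is a quick sketch you can paste into any JS console (using example values that sum to 1.642):

```ts
// JSON has no decimal type: numbers are parsed and serialized as IEEE 754 doubles.
const literal = { value: 1.642 };                 // a plain literal round-trips fine
console.log(JSON.stringify(literal));             // {"value":1.642}

const computed = { value: 1.01 + 0.084 + 0.548 }; // a computed sum carries the artifact
console.log(JSON.stringify(computed));            // {"value":1.6420000000000001}
```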
Mind that the double type still has rounding errors, which can show up as that 00...0001 when performing math operations on fractional values.
If you click Open in Console within the Inspector request panel and execute the query, does it return this rounding issue? If so, then the source of the problem is in the mappings and Elasticsearch.
One possible workaround I can suggest, if you already know that your values always have at most 3 digits of precision, is to configure a 3-digit number format within the Lens visualization.
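To be clear, that number format only changes how the value is rendered, not the value itself. A quick sketch of the idea:

```ts
const sum = 1.01 + 0.084 + 0.548; // what the sum aggregation effectively computes
console.log(sum);                 // 1.6420000000000001
console.log(sum.toFixed(3));      // "1.642" (rounding at display time, like the Lens number format)
```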
Hi Marco, and thank you for your response. When I execute the aggregation in the console, I get the same occasional error. In my case the numbers only have four digits and I run a simple sum aggregation. I think this should not run into any double rounding errors, since I don't calculate fractions.
I had already thought of the workaround with a round aggregation, but I wanted to report the bug anyway. When you say it is a problem on the Elasticsearch side, do I have to change the tags on this post?
Result of the aggregation execution in the console:
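Since I can't attach that screenshot either, here is a minimal sketch of the call I executed; `my-index` and `my_value` are placeholders for the real (redacted) names:

```ts
// Needs Node 18+ (global fetch) and a reachable Elasticsearch; ESM for top-level await.
const res = await fetch('http://localhost:9200/my-index/_search', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    size: 0,
    aggs: {
      total: { sum: { field: 'my_value' } }, // the simple sum aggregation from Lens
    },
  }),
});
const body = await res.json();
console.log(body.aggregations.total.value); // 1.6420000000000001 instead of 1.642
```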
This depends on many things (e.g. the Java version used by Elasticsearch, which may have been upgraded, changes to other internals, etc.), but in general a sum of fractional numbers is always subject to this kind of rounding issue.
A Java double is similar to how numbers are handled in JS. If you try 1.01 + 0.084 + 0.548 in the console, it returns 1.6420000000000001. The link above explains a bit how double-precision numbers work in JS, but the representation is pretty similar in Java too.
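For instance, straight from a browser console:

```ts
// IEEE 754 double arithmetic; the behavior is the same in JS and Java.
console.log(1.01 + 0.084 + 0.548);           // 1.6420000000000001
console.log(1.01 + 0.084 + 0.548 === 1.642); // false
```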
As said, Java uses a similar but not identical representation, so there are still subtle differences between the two.
Using a float in the mapping will lead to less precision (about 7 significant digits vs 15 for a double), but it does not guarantee that the problem won't appear at the 7th digit.
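You can simulate the float32 limit in a JS console with Math.fround, which rounds a double to the nearest 32-bit float:

```ts
// A float32 keeps about 7 significant decimal digits, a double about 15-16.
const d = 0.123456789;             // 9 significant digits
console.log(d);                    // 0.123456789 (a double round-trips all of them)
console.log(Math.fround(d) === d); // false: the nearest float32 already differs around the 8th digit
```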
Using an integer in the mapping is the approach often used in scenarios where exact precision is required. In this case values are scaled (e.g. * 1000 if 3 decimal digits are required) when stored and scaled back when displayed. A classic example is money: amounts are stored as cents (value * 100), then divided by 100 for display.
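A minimal sketch of the scaling idea, done application-side with a factor of 1000 for 3 decimal digits (the mapping would then use an integer type such as long):

```ts
const SCALE = 1000; // 3 decimal digits of required precision

const toScaled = (v: number) => Math.round(v * SCALE); // scale up before indexing
const fromScaled = (s: number) => s / SCALE;           // scale down only for display

const values = [1.01, 0.084, 0.548];

// A plain double sum carries the artifact:
console.log(values.reduce((a, b) => a + b, 0)); // 1.6420000000000001

// Summing the scaled integers is exact; only the final division touches fractions:
const total = values.map(toScaled).reduce((a, b) => a + b, 0); // 1010 + 84 + 548 = 1642
console.log(fromScaled(total)); // 1.642
```

With the field mapped as an integer, the sum aggregation then operates on exact integer values.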
Thank you for your explanation. As you can imagine, this is not very satisfying, as we now have to apply the "round aggregation" workaround to the affected diagrams. I had hoped to get the old behavior back.
You can try asking in the Elasticsearch section; maybe they can provide an alternative approach, as there may be some special handling there that I'm not aware of.