Is there a way to adjust aggregation precision thresholds? If I run an aggregation on values smaller than 0.1, it always returns 0, and I would like it not to.
For example, if I run a max, sum, or avg aggregation on the values 0.052, 0.047, and 0.025, Elasticsearch returns 0.
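For reference, a minimal request that reproduces this, using a hypothetical index name (`metrics`) and field name (`value`):

```json
POST metrics/_search
{
  "size": 0,
  "aggs": {
    "max_value": { "max": { "field": "value" } },
    "avg_value": { "avg": { "field": "value" } },
    "sum_value": { "sum": { "field": "value" } }
  }
}
```

All three aggregations come back as 0.0 even though the documents contain 0.052, 0.047, and 0.025.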
After pulling my hair out over this, I have finally resolved it. The issue was that when the summary statistics came in and created a new index, the first document always seemed to have a count of 0, and therefore 0s for mean, min, max, and stddev. Because those initial values were whole numbers, dynamic mapping created the fields as long rather than float. I adjusted the index template to force them to float, deleted the old indices, and everything now works as expected.
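A minimal sketch of that template change, assuming a hypothetical index pattern (`summary-stats-*`) and field names; the exact API (legacy `_template` vs. the newer `_index_template`) depends on your Elasticsearch version:

```json
PUT _template/summary_stats
{
  "index_patterns": ["summary-stats-*"],
  "mappings": {
    "properties": {
      "count":  { "type": "long" },
      "mean":   { "type": "float" },
      "min":    { "type": "float" },
      "max":    { "type": "float" },
      "stddev": { "type": "float" }
    }
  }
}
```

Deleting the old indices (or reindexing) is unavoidable here, since the type of an existing field cannot be changed in place.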
The weird part is that the Discover page and the raw JSON would show the full number with all its decimal places, but using the long-mapped value in a mathematical agg (such as avg or max) seemed to truncate it to one decimal place, so numbers larger than 0.1 would show up while smaller ones showed up as zero. In hindsight this makes sense: Discover displays the original `_source` document, while aggregations run against the indexed values, and the long mapping drops the fractional part at index time.
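If you hit something similar, one quick check is the field's actual mapping (index and field names here are hypothetical):

```json
GET summary-stats-*/_mapping/field/mean
```

If the response shows `"type": "long"` rather than `"float"`, dynamic mapping inferred an integer type from the first documents it saw, and the fraction is lost from the indexed value even though `_source` still contains it.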