Aggregation and estimation

I'm searching an index with over 110 million documents and running large sums across this data.

I know that Elasticsearch uses estimation to avoid summing every matching document in a search. This is a necessary evil for us: summing the whole field exactly would take so long that our users wouldn't accept it anyway.
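For illustration, the aggregation in question has roughly this shape (index and field names are placeholders; this is Kibana Dev Tools syntax, which allows // comments):

```json
GET /transactions/_search
{
  "size": 0,
  "aggs": {
    "by_entity": {
      // Each shard only returns its own top buckets,
      // which is where the estimation comes from.
      "terms": {
        "field": "entity",
        "size": 10,
        "order": { "total_amount": "desc" }
      },
      "aggs": {
        "total_amount": { "sum": { "field": "amount" } }
      }
    }
  }
}
```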

However, this causes us a problem when there is a very large negative number. For instance, one entity in the data published a $25bn positive transaction immediately followed by a -$25bn transaction, and the two exactly offset each other.

The problem is that Elasticsearch is summing the former but not the latter. Is there anything we can do to prevent this sort of large-scale error occurring in our aggregations (other than forcing full accuracy on them)?
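For example, one option we're aware of (a sketch only, with placeholder names and guessed values) is raising shard_size on the terms aggregation, so each shard contributes many more candidate buckets before the final top N are chosen:

```json
GET /transactions/_search
{
  "size": 0,
  "aggs": {
    "by_entity": {
      "terms": {
        "field": "entity",
        "size": 10,
        // More candidates per shard means better accuracy,
        // at some cost in latency and memory.
        "shard_size": 5000,
        "order": { "total_amount": "desc" }
      },
      "aggs": {
        "total_amount": { "sum": { "field": "amount" } }
      }
    }
  }
}
```

Would that be enough here, or is a composite aggregation (which pages through every bucket exactly, at the cost of multiple requests) the only reliable fix?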

