This issue came up when I was trying to get the top ten summed amounts using a terms aggregation on variable A. Variable A has a very high cardinality, and that makes the top-ten result inaccurate. The first thing I tried was setting size to 1000, but the result was still wrong. After that I set shard_size to 30000 and put size back to 10, and the result was correct.

That solves my problem for now, but why 30000 for shard_size? How do I calculate the exact shard_size to use in a terms aggregation?

Note: variable A has about 6,800,000 unique values.
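For context, here is roughly the request that now returns correct results. The index and field names (`my-index`, `variable_a`, `amount`) are placeholders standing in for my real mapping; the relevant parts are size, shard_size, and the ordering by the summed amount:

```
GET my-index/_search
{
  "size": 0,
  "aggs": {
    "top_amounts": {
      "terms": {
        "field": "variable_a",
        "size": 10,                          // how many buckets I finally want
        "shard_size": 30000,                 // candidates each shard contributes to the final merge
        "order": { "total_amount": "desc" }  // rank terms by summed amount, not doc count
      },
      "aggs": {
        "total_amount": {
          "sum": { "field": "amount" }
        }
      }
    }
  }
}
```

If I leave shard_size unset, Elasticsearch defaults it to size * 1.5 + 10, which is only 25 here, so each shard contributes far too few candidates for a field with millions of unique values.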
I'm sorry, to be honest I don't understand what this means. Does "single shard" mean shard_size = 1? Whenever shard_size is below 30,000, the aggregation result is still inaccurate.
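If it helps to narrow this down, one thing I can try (just a sketch on my side, and it assumes the default ordering by doc count; with my ordering on the summed amount Elasticsearch cannot compute this bound) is to ask for the per-term error and compare it across shard_size values:

```
GET my-index/_search
{
  "size": 0,
  "aggs": {
    "by_variable_a": {
      "terms": {
        "field": "variable_a",
        "size": 10,
        "shard_size": 1000,
        "show_term_doc_count_error": true  // adds doc_count_error_upper_bound per bucket
      }
    }
  }
}
```

When doc_count_error_upper_bound comes back as 0, the returned top terms are exact; a nonzero value means a shard may have withheld a term that large, so shard_size needs to go up.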