Hey all,
is there a good way to run a range facet with X number of buckets without specifying the size of each bucket? Currently I run a statistical facet, do some math with the mean and standard deviation to construct the buckets, and then run a range facet. That works, but it's not ideal since it requires two queries. Any thoughts?
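For reference, the in-between step of that two-query workaround is just arithmetic done client-side: take the mean and std deviation from the statistical facet and derive equal-width bucket boundaries to feed into the range facet. A minimal sketch (the function name and the ±2σ window are my own assumptions, not anything Elasticsearch provides):

```python
def buckets_from_stats(mean, std, num_buckets, spread=2.0):
    """Build num_buckets equal-width ranges covering mean +/- spread*std.

    Mirrors the 'math with the mean and std deviation' step done
    client-side between the stats query and the range facet query.
    """
    low = mean - spread * std
    high = mean + spread * std
    width = (high - low) / num_buckets
    ranges = []
    for i in range(num_buckets):
        ranges.append({"from": low + i * width, "to": low + (i + 1) * width})
    # Leave the outer buckets open-ended so outliers beyond the
    # +/- spread window are still counted somewhere.
    ranges[0].pop("from")
    ranges[-1].pop("to")
    return ranges
```

The resulting list of `{"from": ..., "to": ...}` dicts can then be dropped into the `ranges` array of the second (range facet) request body.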
I have the same requirement and have not been able to solve it
with the existing tools in Elasticsearch.
My approach works well in my system since the range of values is
constant. Given the historical data, I break the overall range down
into dozens of sub-ranges which I make into range facets. I then
assemble these facets into a pre-defined number of buckets on the
client side. Not very efficient, but it works. A custom collector
could be used, but I did not want to have to support it with every new
Elasticsearch release.
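A minimal sketch of that client-side assembly step, assuming the fine-grained facet comes back as a sorted, contiguous list of `(from, to, count)` entries. The merge policy here (greedily closing a bucket once it holds roughly an equal share of the documents) is one plausible choice, not necessarily the one used above:

```python
def assemble_buckets(sub_ranges, num_buckets):
    """Greedily merge contiguous fine sub-ranges into num_buckets coarser
    buckets with roughly equal document counts.

    sub_ranges: sorted, contiguous list of (from, to, count) tuples,
    e.g. the counts returned by a range facet over dozens of
    pre-defined sub-ranges.
    """
    total = sum(count for _, _, count in sub_ranges)
    target = total / num_buckets
    buckets = []
    current = {"from": sub_ranges[0][0], "to": sub_ranges[0][1], "count": 0}
    for frm, to, count in sub_ranges:
        current["to"] = to
        current["count"] += count
        # Close this bucket once it reaches the target share,
        # unless it is the last bucket (which absorbs the remainder).
        if current["count"] >= target and len(buckets) < num_buckets - 1:
            buckets.append(current)
            current = {"from": to, "to": to, "count": 0}
    buckets.append(current)
    return buckets
```

Because the merge only ever combines adjacent pre-defined sub-ranges, the bucket boundaries are only as fine as the sub-ranges themselves, which is the main accuracy cost of this workaround.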