I'm trying to visualize a bar chart where the metric is the sum of a field and the bucket is a date histogram on the timestamp field with a one-second interval, but the interval is automatically set to hourly because of too many buckets.
How can I get the sum of the field for every second?
Using some test data, when my time range is 15 minutes, I see the warning that the interval has been scaled up because of too many buckets, but if I drill down into a shorter time range, it will show me seconds.
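For reference, this is roughly what the underlying aggregation looks like when run directly against Elasticsearch. This is a minimal sketch, not the exact query Kibana issues; the index name `my-index` and the fields `timestamp` and `bytes` are placeholder assumptions:

```
POST my-index/_search
{
  "size": 0,
  "query": {
    "range": { "timestamp": { "gte": "now-15m" } }
  },
  "aggs": {
    "per_second": {
      "date_histogram": {
        "field": "timestamp",
        "fixed_interval": "1s"
      },
      "aggs": {
        "total_bytes": { "sum": { "field": "bytes" } }
      }
    }
  }
}
```

A `fixed_interval` like this is not subject to Kibana's automatic interval scaling, but Elasticsearch itself will still reject the request if it produces more buckets than the `search.max_buckets` limit allows, which is why narrowing the time range, as observed above, is usually the practical fix.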
I have transaction log data with timestamp, processing time, and message size fields, and what I want to see is the maximum number of transactions occurring per second within a given minute/hour/day.
But whenever I choose an interval of one second, it tells me that because of too many buckets it has set the interval to an hour.
As you are looking for the max per one-second interval, any direct aggregation will generate a huge number of buckets and cause memory problems when querying longer time periods. One approach to get around this would be to run an aggregation over the last minute every minute, calculate the max there, and store the result as summary records in a separate index. You can then run a simple max aggregation against this summary index instead, which would be more efficient, as longer intervals could be used while still giving the correct max value for that interval.
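As a rough sketch of that approach (the index names `transaction-logs` and `transaction-summary` and the field names are assumptions, not anything from your setup): a per-second date histogram over the previous full minute, combined with a `max_bucket` sibling pipeline aggregation over the bucket doc counts, returns the peak transactions per second for that minute:

```
POST transaction-logs/_search
{
  "size": 0,
  "query": {
    "range": { "timestamp": { "gte": "now-1m/m", "lt": "now/m" } }
  },
  "aggs": {
    "per_second": {
      "date_histogram": { "field": "timestamp", "fixed_interval": "1s" }
    },
    "max_tps": {
      "max_bucket": { "buckets_path": "per_second._count" }
    }
  }
}
```

The resulting `max_tps` value would be indexed once a minute as a summary document (for example `{ "timestamp": "2024-01-01T00:01:00Z", "max_tps": 42 }`, purely illustrative). Any longer-interval question then reduces to a cheap max over the summaries:

```
POST transaction-summary/_search
{
  "size": 0,
  "query": {
    "range": { "timestamp": { "gte": "now-1d/d" } }
  },
  "aggs": {
    "peak_tps": { "max": { "field": "max_tps" } }
  }
}
```

Since the summary index holds one document per minute instead of one bucket per second, this stays cheap even over days or months, while still reporting the true per-second peak.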