The problem is as follows:
I am querying the number of documents with a given "type" attribute over a time interval, let's say 1 week; at first I have around 600 documents.
On the other hand, I want to retrieve exactly 10 buckets from this aggregation, so I make sure this number is less than or equal to the number of documents available for that particular time range.
Given that, I divide one week in milliseconds (604800000) by 10, getting an interval of 60480000.
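For clarity, here is a minimal sketch of those two steps in Python. It is purely illustrative: the host and the index name "sensors-*" are placeholders, and it assumes the 7.x elasticsearch-py client.

# Minimal sketch of the two preparatory steps, assuming a local cluster and a
# placeholder index name "sensors-*" (elasticsearch-py 7.x).
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

gte, lte = 1568232646238, 1568837446238  # the 1-week range, in epoch millis

# Step 1: count the documents with type "sensors" in that range (~600 in my case)
count_body = {
    "query": {
        "bool": {
            "filter": [
                {"term": {"type.keyword": {"value": "sensors"}}},
                {"range": {"@timestamp": {"format": "epoch_millis",
                                          "gte": gte, "lte": lte}}}
            ]
        }
    }
}
doc_count = es.count(index="sensors-*", body=count_body)["count"]

# Step 2: derive the histogram interval: one week in ms, divided into 10 buckets
interval = (lte - gte) // 10  # 604800000 // 10 == 60480000
print(doc_count, interval)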
Then comes the histogram aggregation:
{
  "size": 0,
  "aggs": {
    "sensors_aggregation": {
      "histogram": {
        "field": "@timestamp",
        "interval": 60480000,
        "min_doc_count": 1
      },
      "aggs": {
        "sensors.high_avg": {
          "avg": {
            "field": "sensors.high"
          }
        },
        "sensors.current_avg": {
          "avg": {
            "field": "sensors.current"
          }
        }
      }
    }
  },
  "query": {
    "bool": {
      "filter": [
        {
          "term": {
            "type.keyword": {
              "value": "sensors"
            }
          }
        },
        {
          "range": {
            "@timestamp": {
              "format": "epoch_millis",
              "gte": 1568232646238,
              "lte": 1568837446238
            }
          }
        }
      ]
    }
  }
}
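For reference, this is a minimal sketch of how I send this request and read the buckets back (same placeholder host and index name as above, elasticsearch-py 7.x):

# Minimal sketch of sending the request above and inspecting the returned buckets.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

body = {
    "size": 0,
    "query": {"bool": {"filter": [
        {"term": {"type.keyword": {"value": "sensors"}}},
        {"range": {"@timestamp": {"format": "epoch_millis",
                                  "gte": 1568232646238, "lte": 1568837446238}}}
    ]}},
    "aggs": {"sensors_aggregation": {
        "histogram": {"field": "@timestamp", "interval": 60480000, "min_doc_count": 1},
        "aggs": {
            "sensors.high_avg": {"avg": {"field": "sensors.high"}},
            "sensors.current_avg": {"avg": {"field": "sensors.current"}}
        }
    }}
}

response = es.search(index="sensors-*", body=body)  # index name is a placeholder
for bucket in response["aggregations"]["sensors_aggregation"]["buckets"]:
    print(bucket["key"], bucket["doc_count"],
          bucket["sensors.high_avg"]["value"],
          bucket["sensors.current_avg"]["value"])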
The result is somewhat unexpected for me at this point: I get only 3 buckets, each with a doc_count between 100 and 300.
My expectation was to get 10 buckets from this query, with the averages of the sub-fields computed over a total of 600 valid documents.
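To make that expectation concrete, here is a small sketch of the 10 bucket start keys I was expecting; the assumption that buckets would be aligned to the start of my range is mine:

# The 10 bucket start keys I expected, assuming buckets aligned to the range start.
gte, lte = 1568232646238, 1568837446238
interval = (lte - gte) // 10  # 60480000 ms
expected_keys = [gte + i * interval for i in range(10)]
print(expected_keys)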
What am I doing wrong?
Thanks