I'd like to get the sum of the doc_count values across all buckets of an aggregation.
At the moment, I have:
{
  "size": 0,
  "query": {
    "function_score": {
      "query": {
        "bool": {
          "filter": {
            "bool": {
              "must": [
                {
                  "range": {
                    "created_at": {
                      "gte": "now-30d",
                      "format": "yyyy-MM-dd'T'HH:mm:ss.SSSZ"
                    }
                  }
                },
                {
                  "term": {
                    "client_id": 1245
                  }
                }
              ]
            }
          }
        }
      }
    }
  },
  "aggs": {
    "date_range": {
      "date_histogram": {
        "field": "timestamp",
        "interval": "1h"
      }
    }
  }
}
... which returns:
{
  "took": 0,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": 830,
    "max_score": 0,
    "hits": []
  },
  "aggregations": {
    "date_range": {
      "buckets": [
        {
          "key_as_string": "2018-07-17T00:00:00.000+0000",
          "key": 1531785600000,
          "doc_count": 113
        },
        {
          "key_as_string": "2018-07-17T01:00:00.000+0000",
          "key": 1531789200000,
          "doc_count": 143
        },
        {
          "key_as_string": "2018-07-17T02:00:00.000+0000",
          "key": 1531792800000,
          "doc_count": 116
        },
        {
          "key_as_string": "2018-07-17T03:00:00.000+0000",
          "key": 1531796400000,
          "doc_count": 119
        },
        {
          "key_as_string": "2018-07-17T04:00:00.000+0000",
          "key": 1531800000000,
          "doc_count": 90
        },
        {
          "key_as_string": "2018-07-17T05:00:00.000+0000",
          "key": 1531803600000,
          "doc_count": 122
        },
        {
          "key_as_string": "2018-07-17T06:00:00.000+0000",
          "key": 1531807200000,
          "doc_count": 93
        },
        {
          "key_as_string": "2018-07-17T07:00:00.000+0000",
          "key": 1531810800000,
          "doc_count": 34
        }
      ]
    }
  }
}
... but I'd also like the sum of those doc_count values, if possible.
I could compute the sum in my application logic, but that would be pointless if Elasticsearch can do it for me.
I found the Cumulative Sum aggregation, but reading the final cumulative value out of the last bucket would still require extra logic on my side.
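For context, this is roughly the shape I'm hoping exists: a pipeline aggregation that sums each bucket's _count into a single value. An untested sketch (the "total_docs" name is just my placeholder), based on the sum_bucket pipeline aggregation:

```json
"aggs": {
  "date_range": {
    "date_histogram": {
      "field": "timestamp",
      "interval": "1h"
    }
  },
  "total_docs": {
    "sum_bucket": {
      "buckets_path": "date_range>_count"
    }
  }
}
```

If that works, the response should contain a single "total_docs" value alongside the histogram buckets, with no post-processing needed.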
Any assistance would be much appreciated.