Hi,
I'm trying to get the average of the time difference between two date fields with the following query:
GET x/_search
{
  "_source": ["ts", "@timestamp"],
  "query": {
    "match_all": {}
  },
  "sort": [
    {
      "@timestamp": {
        "order": "desc"
      }
    }
  ],
  "size": 1,
  "script_fields": {
    "test1": {
      "script": {
        "lang": "painless",
        "source": "(doc['@timestamp'].value.millis - doc['ts'].value.millis) / 1000"
      }
    }
  },
  "aggs": {
    "average_delay": {
      "avg": {
        "script": {
          "lang": "painless",
          "source": "(doc['@timestamp'].value.millis - doc['ts'].value.millis) / 1000"
        }
      }
    }
  }
}
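For reference, both fields are date fields. The relevant part of the mapping looks roughly like this (a sketch from memory, so the exact field parameters may differ):

PUT x
{
  "mappings": {
    "properties": {
      "@timestamp": { "type": "date" },
      "ts": { "type": "date" }
    }
  }
}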
the "test1" scripted field gives the wanted result. but the aggregation is way off of the average (i gave it size =1 ).
as you can see i used the exact same script
if i return only one document size=1 i get test=4000 and the aggragated value is 1000 times larger
ill be happy if anyone can point me to my mistake
thats when quering a size=1
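In case it helps to reproduce, here is a stripped-down version that pins the search to a single document by _id (the id value is a placeholder), so the script field and the aggregation should be looking at exactly the same document:

GET x/_search
{
  "size": 1,
  "query": {
    "ids": { "values": ["REPLACE_WITH_DOC_ID"] }
  },
  "script_fields": {
    "test1": {
      "script": {
        "lang": "painless",
        "source": "(doc['@timestamp'].value.millis - doc['ts'].value.millis) / 1000"
      }
    }
  },
  "aggs": {
    "average_delay": {
      "avg": {
        "script": {
          "lang": "painless",
          "source": "(doc['@timestamp'].value.millis - doc['ts'].value.millis) / 1000"
        }
      }
    }
  }
}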