## Background
Clients connect to my WebSocket server. I have logs for both `join` and `disconnect` events in the following format (logs redacted):
```json
{
  "_index": "filebeat-7.7.0-2020.05.22-000001",
  "_type": "_doc",
  "_id": "5CAqS3IBb2WkqqHd1BPf",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2020-05-25T09:29:48.389Z",
    "message_decoded": {
      "data": {
        "uuid": "b45739b9-8323-43e5-b8f1-50abb67962e9"
      },
      "type": "join"
    }
  },
  "fields": {
    "@timestamp": [
      "2020-05-25T09:29:48.389Z"
    ]
  }
}
```
The `disconnect` log looks similar. Importantly, I have the `@timestamp` and `uuid` fields.
## What I'm trying to do
I would like to compute the average session duration for my clients. Filtering for any `uuid` yields exactly two documents (each `uuid` is unique to a session), and the difference between their two `@timestamp`s is the session duration I am trying to compute.
I want to plot session durations in a line chart / vertical bar chart. I also want to compute the average session duration.
## What I have come up with so far
I started by bucketing the `uuid`s with a `Terms` aggregation. This creates buckets of size 2.
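Expressed as a query (rather than through the visualization builder), this is roughly what I have; the index pattern and the exact field name, including the `.keyword` suffix, are my assumptions based on the log above:

```
// Kibana Dev Tools console. Terms aggregation over session uuids;
// each bucket should contain the join and the disconnect document
// for one session. Index pattern and field name are assumptions.
GET filebeat-*/_search
{
  "size": 0,
  "aggs": {
    "per_session": {
      "terms": {
        "field": "message_decoded.data.uuid.keyword",
        "size": 10000
      }
    }
  }
}
```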
Next, I need to somehow reduce each bucket to a single session duration. However, I cannot seem to find a metric that does this in the Kibana interface. I have looked into the following options:
- Use a `painless` script to compute the date difference. However, I am unsure how to use a script that computes over an entire bucket; the examples I find run a script per document. I have seen that I can include scripts as custom JSON inside Kibana, but I am still unsure what that script would look like.
- Use a Bucket Script Aggregation (I sketch my current idea below). However, the documentation says:

> The specified metric must be numeric and the script must return a numeric value.
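Concretely, I was imagining something like the sketch below: take the `min` and `max` of `@timestamp` inside each `uuid` bucket (these come back as numeric epoch milliseconds, which should satisfy the constraint quoted above), subtract them in the bucket script, and average the results with a sibling `avg_bucket`. This is untested, and again the index pattern and field names are assumptions:

```
// Untested sketch (Kibana Dev Tools console); index pattern and
// field names, including the ".keyword" suffix, are assumptions.
GET filebeat-*/_search
{
  "size": 0,
  "aggs": {
    "per_session": {
      "terms": {
        "field": "message_decoded.data.uuid.keyword",
        "size": 10000
      },
      "aggs": {
        // min/max on a date field return epoch milliseconds, i.e. numbers
        "first_event": { "min": { "field": "@timestamp" } },
        "last_event": { "max": { "field": "@timestamp" } },
        "duration_ms": {
          "bucket_script": {
            "buckets_path": { "start": "first_event", "end": "last_event" },
            "script": "params.end - params.start"
          }
        }
      }
    },
    // sibling pipeline aggregation for the overall average
    "avg_session_duration_ms": {
      "avg_bucket": {
        "buckets_path": "per_session>duration_ms"
      }
    }
  }
}
```

Even if this is the right direction, I still do not see how to get the per-bucket `duration_ms` values into a line or bar chart.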
## My question
What is the easiest way to compute session durations, as outlined above? Could anybody point me in the right direction here?