How to do advanced mathematical computations with ELK?

Hello everyone,

I am wondering if it is possible to do advanced mathematical operations with ELK. Here is an example of what I would like to do:
I have logs from my radio network, and for each device I have a frame counter in every received packet. What I want to do is compute the percentage of packets lost in my network by taking the difference between frame counters for every device. Currently, I can't do this with an aggregation in Kibana.

On the Elastic website, I saw that there is the alternative of a Java native script that could allow me to manipulate my fields and data. However, I have some concerns about it. First, I haven't seen whether I could display the results in Kibana. Second, I would like to compute my percentages as a function of time, and I don't know if that is possible this way.

Furthermore, if you have another alternative, that is fine with me.

Thanks in advance.

Sen

Does anyone have any idea on this topic? :sweat: :sob:

Can you give a more concrete example with actual documents and what you are trying to compute?

Hi Adrien,

Let's take the following example:
My JSON input is a log from my radio network. It contains the device ID, the time, and the frame counter (= packet number). Here is an example with two devices, dev11 and dev55.
{
"device": "dev11",
"time": "xxxxx",
"framecounter": 125
}
{
"device": "dev55",
"time": "xxxxx",
"framecounter": 74
}
{
"device": "dev55",
"time": "xxxxx",
"framecounter": 86
}
{
"device": "dev11",
"time": "xxxxx",
"framecounter": 129
}

What I want to do is check the difference between two consecutive received packets for every device (as a function of time!). In my example, we can see that dev11 has 129 - 125 - 1 = 3 packets lost and dev55 has 86 - 74 - 1 = 11 packets lost.
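To make the computation concrete, here is a minimal Python sketch of that per-device difference, using the example documents above (the placeholder timestamps are omitted; the documents are assumed to be in arrival order):

```python
from collections import defaultdict

# Example documents in arrival (time) order, as in the post above.
docs = [
    {"device": "dev11", "framecounter": 125},
    {"device": "dev55", "framecounter": 74},
    {"device": "dev55", "framecounter": 86},
    {"device": "dev11", "framecounter": 129},
]

def lost_per_device(docs):
    """Sum the gaps between consecutive frame counters for each device."""
    last = {}                 # device -> last frame counter seen
    lost = defaultdict(int)   # device -> packets lost so far
    for doc in docs:
        dev, fc = doc["device"], doc["framecounter"]
        if dev in last:
            # A gap of N between consecutive counters means N-1 lost packets.
            lost[dev] += fc - last[dev] - 1
        last[dev] = fc
    return dict(lost)

print(lost_per_device(docs))  # dev11 lost 3 packets, dev55 lost 11
```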

I have managed to use ELK to receive my logs through Logstash, store them in Elasticsearch, and display them in Kibana. However, what I want to do can't be done with an aggregation... that's why I opened this topic.

Sen

Thank you for the example; I understand better now what you want to achieve. This is not something you can do easily. You could compute the number of lost packets for every device on the client side by running a sorted scroll, ordered by timestamp and filtered by device ID, and summing the differences between consecutive frame counter values.
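A sketch of that client-side approach in Python, using the official `elasticsearch` client's `helpers.scan` with `preserve_order=True` for the sorted scroll. The index name (`radio-logs`) and field names are assumptions taken from the example documents; adjust them to your mapping:

```python
def sum_gaps(framecounters):
    """Total packets lost, given frame counters in ascending time order."""
    total = 0
    prev = None
    for fc in framecounters:
        if prev is not None:
            total += fc - prev - 1
        prev = fc
    return total

def lost_packets(es, device_id):
    """Scroll over one device's documents sorted by time and sum the gaps.

    `es` is an elasticsearch.Elasticsearch client; requires
    `pip install elasticsearch`. Index and field names are assumed.
    """
    from elasticsearch import helpers
    query = {
        "query": {"term": {"device": device_id}},
        "sort": [{"time": "asc"}],
    }
    hits = helpers.scan(es, index="radio-logs", query=query,
                        preserve_order=True)  # keep the sort order
    return sum_gaps(hit["_source"]["framecounter"] for hit in hits)
```

Note that `preserve_order=True` disables the usual scan optimizations, so this can be expensive on large indices.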

In general, elasticsearch cannot run aggregations that depend on a sort order.

Another way to approach this problem would be a batch job that periodically reindexes your index into a second Elasticsearch index, replacing the frame counter field with a lost_packets field (the number of packets lost since the last document with the same device ID). This second index could then be used in Kibana, where you could run e.g. a sum aggregation on the lost_packets field under a date_histogram on the time field to see how packet loss evolves over time.
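The core of such a batch job could look like the following sketch: it takes documents in time order and emits copies with framecounter replaced by lost_packets, ready for bulk indexing into the second index. Field names follow the example documents; the function name is illustrative, not an existing API:

```python
def to_lost_packets(docs):
    """Transform time-ordered docs: replace framecounter with lost_packets.

    The first document seen for a device gets lost_packets = 0, since
    there is no previous counter to compare against.
    """
    last = {}   # device -> last frame counter seen
    out = []
    for doc in docs:
        dev, fc = doc["device"], doc["framecounter"]
        lost = fc - last[dev] - 1 if dev in last else 0
        last[dev] = fc
        new_doc = {k: v for k, v in doc.items() if k != "framecounter"}
        new_doc["lost_packets"] = lost
        out.append(new_doc)
    return out

docs = [
    {"device": "dev11", "time": 1, "framecounter": 125},
    {"device": "dev55", "time": 2, "framecounter": 74},
    {"device": "dev55", "time": 3, "framecounter": 86},
    {"device": "dev11", "time": 4, "framecounter": 129},
]
# Bulk-index to_lost_packets(docs) into the second index; a sum
# aggregation on lost_packets under a date_histogram on time then
# shows packet loss over time in Kibana.
```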