Calculate duration between two events using dates

Hello,

In our current logs, we sometimes need to calculate the elapsed time between two events.

We don't want to use the "elapsed" filter in Logstash because it's not always needed, and because it may occasionally have to keep state for long periods while waiting for the second event to appear.

I manage an MFT platform whose purpose is to receive files from a source and push them to a specific target: source ==> Platform ==> target
The idea is to use the date fields already added to our events to calculate the elapsed time between an incoming file and the corresponding outgoing file.

Events are associated by a transfer_id, so a search on that ID returns the two corresponding events.
Events also carry an identifier indicating whether they describe an incoming or an outgoing transfer.

Each event has a "start_date" and an "end_date".

The aim is to calculate the duration between the "start_date" of the first transfer and the "end_date" of the second transfer.

Do you think it's possible to do this easily in Kibana?

The final goal is a time series visualisation showing the evolution of transfer duration from source to target (by joining the events through their common transfer_id).

EDIT: If joining events on a common field is not possible, I might have a way to get both the start date of the first event and the end date of the second event into a single event.
But I would then still need a way to calculate the duration between the two dates; something like the rough sketch below is what I have in mind.
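A very rough Logstash sketch of that fallback, assuming the merged event carries the two timestamps as plain ISO8601 strings in fields named start_date and end_date (all field names hypothetical, to be adjusted to the real mapping):

filter {
  ruby {
    code => '
      require "time"
      # Hypothetical field names; adjust to the real mapping.
      started = Time.parse(event.get("start_date"))
      ended   = Time.parse(event.get("end_date"))
      # Subtracting two Time objects gives a Float number of seconds.
      event.set("transfer_duration_seconds", ended - started)
    '
  }
}

The resulting numeric field could then feed the time series visualisation directly, without any join at query time.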

Thanks for any advice.

Hi, I'm just jumping in with a left-field idea, because this post has not gotten a response yet.

In our current logs, we sometimes need to calculate the elapsed time between two events.

Well, this seems a little nightmare-ish. State is not something that should be kept in a logging system; what you need from a logging system is for it to be resilient and correct. If the pipeline starts producing its own derived data, that data is detached from the original system and will probably not help you trace back what actually happened there, which is primarily what logs should do. You would never expect the correctness of your logs to depend on the uptime of the logging pipeline, but that is likely to happen if you solve the problem this way.

You are welcome to bang your head against this all you want, but nightmares aren't fun.

The users who end up most successful with Kibana make sure their data already contains the fields it needs before the data ever reaches Kibana.


I recently ran into a similar issue. I used a few Logstash filters to convert the fields into date/time data types, then converted those fields to epoch seconds and ran a simple subtraction to find the difference.

filter {
  #Convert the raw strings into date/time (LogStash::Timestamp) fields
  date {
    match => [ "opened", "yyyy-MM-dd HH:mm:ss" ]
    target => "opened"
  }
  date {
    match => [ "closed", "yyyy-MM-dd HH:mm:ss" ]
    target => "closed"
  }

  #Convert the opened date/time field to epoch (seconds)
  ruby { code => 'event.set("opened_epoch", event.get("opened").to_i)' }

  #Convert the closed date/time field to epoch (seconds)
  ruby { code => 'event.set("closed_epoch", event.get("closed").to_i)' }

  #Subtract opened from closed to find the duration in seconds
  ruby { code => 'event.set("duration", event.get("closed_epoch").to_i - event.get("opened_epoch").to_i)' }
}
