Calculate time difference between 2 lines in log file

Hi everyone,

I retrieved a log file that looks like this:

2019-03-07 04:35:57.421     19EC | INFO    ROOT:                   (Netto2600-Brutto2608)AllocatedBlockCount=0
2019-03-07 04:35:57.421     19EC | INFO    ROOT:                   (Netto2600-Brutto2608)ReservedAddressSpace=0
2019-03-07 04:35:57.421     19EC | INFO    TIMEDATA.DLL:         Capacity allocated: Hashtable entries 20648881, Heap mem 805306368
2019-03-07 04:35:57.421     19EC | INFO    DPREAD:               Data import started: 07.03.19 04:35:57
2019-03-07 04:35:57.421     19EC | INFO    DPREAD:               Importing file: C:\path\to\file1
2019-03-07 04:35:57.452     19EC | INFO    DPREAD:               Completed importing file: C:\path\to\file1
2019-03-07 04:35:57.452     19EC | INFO    DPREAD:               Importing file: C:\path\to\file2
2019-03-07 04:36:43.545     19EC | INFO    DPREAD:               Completed importing file: C:\path\to\file2
2019-03-07 05:38:55.332     19EC | INFO    TMDPDATA-INIT:        Datasupply info: vwdpm.dcsDefault.2.0
2019-03-07 05:38:55.520     19EC | DEBUG   TMDPDATA-GETFILEDATE: MdpFileIdentifier=[+]
2019-03-07 05:38:55.520     19EC | DEBUG   TMDPDATA-GETFILEDATE: FileDate=[06.03.19]

What I want to do is calculate the difference between the timestamps of the "Importing file" and "Completed importing file" lines. Could someone help me with this? Should I use a Logstash filter (like aggregate or elapsed)? Or is this done directly in Filebeat?

Thank you for your help! 🙂


For Logstash, the elapsed filter plugin is what you're looking for.
You need to add tags to your events (start/end) and an ID (maybe your file name?) via some other filter (possibly a grok, or better yet a dissect filter).
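As a minimal sketch, assuming the log format above: the field names (date, time, thread, level, source, action, file) and the tag names are hypothetical, and lines without an "action: value" tail would simply get a _dissectfailure tag you could drop or ignore.

    dissect {
        # split on the literal delimiters; the -> modifier absorbs repeated spaces
        mapping => {
            "message" => "%{date} %{time->} %{thread} | %{level->} %{source->} %{action}: %{file}"
        }
    }
    if [action] == "Importing file" {
        mutate { add_tag => [ "importStart" ] }
    } else if [action] == "Completed importing file" {
        mutate { add_tag => [ "importEnd" ] }
    }
    elapsed {
        start_tag       => "importStart"
        end_tag         => "importEnd"
        unique_id_field => "file"   # pair the start/end events by file path
    }

Pairing on the file path rather than the thread ID means several imports can be in flight at once without confusing the filter.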

Filebeat cannot do that; you'd need to do this in an ingest pipeline instead. However, I don't think there is an equivalent processor available so far: https://www.elastic.co/guide/en/elasticsearch/reference/master/ingest-processors.html

Hello BennyInc,

I tried something like this:

grok {
    match => ["message", "%{DATE_EU:date} %{TIME:hour} +%{DATA:test} \| %{LOGLEVEL:level} +%{DATA:type}: +%{DATA:state_import}: %{GREEDYDATA:url}"]
    add_tag => [ "%{state_import}" ]
}

elapsed {
    unique_id_field => "test"
    start_tag => "Importing file"
    end_tag => "Completed importing file"
    add_tag => ["time_elapsed"]
}

if "time_elapsed" in [tags] {
    aggregate {
        task_id => "%{test}"
        code => "map['report'] = (event.get('elapsed_time')*1000).to_i"
        map_action => "create"
    }
}

But in Kibana, I have an "elapsed_time" field, but it is set to 0 when it should contain the actual time difference.

You need to parse the timestamp from the log before trying to use it in the elapsed filter. elapsed measures the difference between the @timestamp values of the start and end events, so if you never set @timestamp from the log line, every event defaults to the time it was processed and the difference comes out as roughly 0.

    grok {
        match => ["message", "%{DATE_EU:date} %{TIME:hour} +%{DATA:test} \| %{LOGLEVEL:level} +%{DATA:type}: +%{DATA:state_import}: %{GREEDYDATA:url}"]
        add_tag => [ "%{state_import}" ]
        # keep the combined date+time in @metadata so it is not indexed
        add_field => { "[@metadata][ts]" => "%{date} %{hour}" }
    }
    # set @timestamp from the log line so elapsed has real times to subtract
    date { match => [ "[@metadata][ts]", "ISO8601" ] }

What are you hoping to do with the aggregate filter?

Hi,

I followed this link: https://stackoverflow.com/questions/37353365/calculating-time-between-events/37359000

My aggregate filter takes the elapsed_time and converts it to ms. Is it useless to have an aggregate filter in my case?

Thank you for your help

If you just want to convert seconds to milliseconds, a ruby filter will do it.

    if "time_elapsed" in [tags] {
        ruby { code => 'event.set("elapsed_time_ms", 1000.0 * event.get("elapsed_time"))' }
    }

In that Stack Overflow answer, the rewrite of the initial solution using the event API is missing an essential event.set, so the aggregate filters are not doing anything.
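
For illustration, this is roughly what the fixed version would look like (a sketch only, reusing the field names from your config above): the value computed in map has to be written back onto the event with event.set, otherwise it stays in the aggregate map and never reaches the indexed document.

    if "time_elapsed" in [tags] {
        aggregate {
            task_id => "%{test}"
            code => "
                map['report'] = (event.get('elapsed_time') * 1000).to_i
                event.set('report', map['report'])  # without this, the value never reaches the event
            "
            map_action => "create"
        }
    }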
