@timestamp in microseconds


(Aharon Haber) #1

Are there any plans to support @timestamps in microseconds (or even nanoseconds)? I am trying to measure elapsed time between two events and I cannot see an easy way to do this in logstash (I am new BTW).

I could of course add timers to my code logs to produce duration numbers but that would defeat (some of) the purpose. I am kind of surprised this is not asked more which makes me think I am missing something. Thanks!


(Christian Dahlqvist) #2

If you are sending your events to Elasticsearch, timestamps there currently support only millisecond resolution.
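To illustrate why millisecond resolution is a problem for this use case, here is a small sketch (the timestamps are made up for the example): two events that are 250 microseconds apart become indistinguishable once their timestamps are truncated to milliseconds. `to_r` is used instead of `to_f` so the arithmetic is exact.

```ruby
require 'time'

# Two hypothetical events 250 microseconds apart
t1 = Time.parse("2023-01-01T12:00:00.000100Z")
t2 = Time.parse("2023-01-01T12:00:00.000350Z")

# At microsecond resolution the gap is recoverable
micros = ((t2.to_r - t1.to_r) * 1_000_000).to_i  # 250

# Truncated to milliseconds, both events collapse to the same timestamp
ms1 = (t1.to_r * 1000).to_i
ms2 = (t2.to_r * 1000).to_i  # ms1 == ms2
```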


(Aharon Haber) #3

Hi Christian, thanks for the link. I had not seen it. On the one hand it is reassuring to see that it is being considered; on the other hand it looks like it will be a long time until it is available. I do keep a field with the time in microseconds, which makes me think that with some custom programming (assuming I had the time) I could create my own elapsed plugin to record elapsed time in microseconds. As I said, I am trying not to add it to my code, which would be the easiest thing, as it seems "wrong" to do (and I would have to do so for all my events) given that I am already using a logging statistics framework.


(Christian Dahlqvist) #4

Doing the calculation at the source has the benefit that it may scale better, as you do not need all related events to pass through a single Logstash instance/thread for the calculation. So unless you know your volumes are always going to be low, it may be worth considering.


(Aharon Haber) #5

To be clear, I was asking for a solution using @timestamp because I wanted to use the elapsed plugin. Instead I am using my own custom timestamp field.
For others who might be looking for a way to measure elapsed time in microseconds (or nanoseconds) between two events: the solution for me was fairly simple using the aggregate filter:

  1. Use tags to designate the start event and the end event (e.g. add_tag => ["STARTTAG"]).

  2. Have a custom field with a timestamp in ISO8601 format that includes micro/nanoseconds (I called mine "timestamp") - for both the start and end events.

  3. Designate a field to serve as the task id for both start and end events (as instructed in the aggregate filter documentation).

  4. For the start event, use the aggregate plugin:

    if "STARTTAG" in [tags] {
      aggregate {
        task_id => "%{someidfield}"
        code => "map['startstamp'] = event.get('timestamp')"
        map_action => "create"
        add_tag => ["metric"]
      }
    }

  5. For the end event, use the aggregate plugin as follows (elapsed time converted to microsecond units):

    if "ENDTAG" in [tags] {
      aggregate {
        task_id => "%{someidfield}"
        code => "event.set('Event_Duration_In_Micros', ((Time.parse(event.get('timestamp')).to_f - Time.parse(map['startstamp']).to_f) * 1000000).to_i)"
        map_action => "update"
        end_of_task => true
        add_tag => ["metric"] # used for separation later down the pipeline
      }
    }
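The Ruby inside the end-event `code =>` string can be tried out on its own. Here is a minimal sketch with made-up timestamps (in the filter they come from `map['startstamp']` and `event.get('timestamp')`); note that `to_r` is used instead of the `to_f` in the config above, which avoids floating-point rounding when the epoch seconds get large:

```ruby
require 'time'

# Hypothetical start/end timestamps with microsecond precision
startstamp = "2023-05-04T10:15:30.123456Z"
endstamp   = "2023-05-04T10:15:30.987654Z"

# Elapsed time in whole microseconds, same arithmetic as the code => line
duration_micros = ((Time.parse(endstamp).to_r -
                    Time.parse(startstamp).to_r) * 1_000_000).to_i
# duration_micros == 864198
```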

Hope this helps


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.