How to index nanosecond-precision events with Logstash (7.10) and the date_nanos type

Problem statement

Logstash's native @timestamp field only keeps millisecond precision, so an event that carries a nanosecond epoch timestamp loses precision when it is indexed with the default date mapping. Elasticsearch's date_nanos type can store the full precision, but Logstash does not produce such a field out of the box.

Workaround
I'll share how I managed to do it, but I'm happy to hear about better options.

Step #1 Convert your nanosecond timestamp from a number to a date string
(skip this step if your timestamp is already in this format)
The Logstash date filter does not support a nanosecond epoch (UNIX_NS) conversion yet, so we do the conversion with Ruby code.

  # Example source value: 1234567890123456789 nanoseconds since the epoch,
  # i.e. the date "2009-02-13T23:31:30.123456789Z"
  mutate { add_field => { "@timestamp_source" => "1234567890123456789" } }
  ruby {
    code => "
        source  = event.get('[@timestamp_source]')
        seconds = source[0...10].to_i   # whole seconds (first 10 digits)
        nanos   = source[-9..-1]        # nanosecond fraction (last 9 digits)
        # format in UTC so the trailing 'Z' is correct, then append the nanoseconds
        event.set('[@timestamp_nanoseconds]', Time.at(seconds).utc.strftime('%Y-%m-%dT%H:%M:%S.') + nanos + 'Z')
        event.set('[@timestamp]', LogStash::Timestamp.new(event.get('[@timestamp_nanoseconds]')))
      "
  }

We get @timestamp_nanoseconds as a date string with full nanosecond precision, and the native @timestamp with millisecond precision.
Output:

{
         "@timestamp_source" => "1234567890123456789",
                   "message" => "",
    "@timestamp_nanoseconds" => "2009-02-13T23:31:30.123456789Z",
                "@timestamp" => 2009-02-13T23:31:30.123Z,
                      "host" => "local",
                  "@version" => "1"
}

Step #2 Add the date_nanos type for the @timestamp_nanoseconds field to your index template

{
  "mappings": {
    "properties": {
      "@timestamp":             { "type": "date" },
      "@timestamp_nanoseconds": { "type": "date_nanos" }
    }
  }
}
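
For reference, here is a minimal sketch of applying that mapping through the composable index template API available in 7.10. The template name nanos-events and the index pattern nanos-* are just placeholders; use your own naming.

PUT _index_template/nanos-events
{
  "index_patterns": ["nanos-*"],
  "template": {
    "mappings": {
      "properties": {
        "@timestamp":             { "type": "date" },
        "@timestamp_nanoseconds": { "type": "date_nanos" }
      }
    }
  }
}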

Step #3 Index the data and create a Kibana index pattern, selecting @timestamp_nanoseconds as the Time Field
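
For the indexing part, a minimal sketch of the Logstash elasticsearch output could look like this (the host URL and the nanos-%{+YYYY.MM.dd} index name are assumptions; use whatever matches the index pattern of your template):

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "nanos-%{+YYYY.MM.dd}"
  }
}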

The new precision then becomes available in Kibana.
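
If you want to verify the stored values outside Kibana, here is a sketch of a search that returns the nanosecond doc values, assuming the nanos-* pattern from the earlier examples:

GET nanos-*/_search
{
  "docvalue_fields": [
    { "field": "@timestamp_nanoseconds", "format": "strict_date_optional_time_nanos" }
  ],
  "sort": [
    { "@timestamp_nanoseconds": "desc" }
  ]
}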

I hope it helps.
Thanks!
