Converting timezones in Logstash - HOWTO

Hi,

After trawling much of the internet for how to convert a timestamp from UTC to local time in Logstash, I came up blank, and ran into a whole bunch of answers on here saying "don't - leave that to the presentation layer".

If, however, you're using, say, the stdout or file output, then Logstash is your presentation layer, so it might be helpful to know how to do it.

In the end, I've come to a solution using a Ruby filter that calls into Java to do this, and I thought I'd share it so anyone else on this quest at least finds one way it can be done.

So in a file local_date.rb:

# Called once at pipeline startup; params is the script_params hash from the config.
def register(params)
    # Resolve the IANA zone name (e.g. "Pacific/Auckland") to a java.time.ZoneId.
    @tz = java.time.ZoneId.of(params["timezone"])
    @date_format = java.time.format.DateTimeFormatter.ofPattern("yyyy-MM-dd")
end

# Called for every event; must return the array of events to pass on.
def filter(event)
    # @timestamp holds a UTC instant; convert it to the configured zone and
    # drop the sub-second part.
    zoned_dt = java.time.ZonedDateTime
        .ofInstant(event.get("@timestamp").to_java.toInstant, @tz)
        .truncatedTo(java.time.temporal.ChronoUnit::SECONDS)
    event.set("@timestamp_local", zoned_dt.format(java.time.format.DateTimeFormatter::ISO_OFFSET_DATE_TIME))
    event.set("@date_local", zoned_dt.format(@date_format))
    return [event]
end

then use it in a config like this:

input {
    generator {
        lines => [
            "hello",
            "world",
            "this is lots of loggy"
        ]
        count => 1
    }
}

filter {
  ruby {
    path => "/path/to/local_date.rb"
    script_params => {
        timezone => "Pacific/Auckland"
    }
  }
}

output {
  stdout {
    codec => line {
      format => "[%{@timestamp_local}] %{message}"
    }
  }
}
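
If you want to sanity-check the java.time conversion outside of a Logstash pipeline, a standalone JRuby snippet along these lines should do it (just a sketch - it uses Instant.now rather than an event's @timestamp, and the zone name is only an example):

require 'java'

tz = java.time.ZoneId.of("Pacific/Auckland")
zoned = java.time.ZonedDateTime
    .ofInstant(java.time.Instant.now, tz)
    .truncatedTo(java.time.temporal.ChronoUnit::SECONDS)

# Same two formats the filter script writes into the event.
puts zoned.format(java.time.format.DateTimeFormatter::ISO_OFFSET_DATE_TIME)
puts zoned.format(java.time.format.DateTimeFormatter.ofPattern("yyyy-MM-dd"))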

I hope someone finds this useful! Or if there's a better way, I'm all ears. There's not a lot of API documentation around the internal datetime representation, so I just went off the code. It didn't look like there was a way to handle proper IANA timezones inside the Ruby realm with the gems that were available.
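
For completeness: plain Ruby's Time can shift to a fixed UTC offset, but that isn't the same thing as a proper named zone, because the offset for Pacific/Auckland moves with daylight saving - which is exactly the gap the Java interop fills. A rough sketch of the difference (the +13:00 offset is just illustrative):

utc = Time.now.utc

# Fixed offset: works, but you have to already know whether it's +12:00 or +13:00 today.
puts utc.getlocal("+13:00")

# Named zone via java.time: the correct offset for that instant is picked for you, DST included.
require 'java'
tz = java.time.ZoneId.of("Pacific/Auckland")
instant = java.time.Instant.ofEpochMilli((utc.to_f * 1000).round)
puts java.time.ZonedDateTime.ofInstant(instant, tz)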


LS uses @timestamp from the source if it is provided, otherwise it defaults to the clock of the host LS is running on.
ES works in UTC, so LS will always send date fields in UTC format.

If the source data needs a timezone change, there is the date plugin, where you can set the timezone of your data.

    date {
        match => ["timestamp", "ISO8601"]
        timezone => "Pacific/Auckland"
        target => "yourfield" # default is @timestamp
    }

So if you have 10 date fields, you can add 10 date conversions, each with whatever timezone you want.

Kibana will present the data based on your local time settings, or you can set it to display the same timezone as the source.

Sure, except in this instance there's no Kibana, no Elasticsearch. Just Logstash.

The date filter's timezone option says which timezone the incoming date is in; it doesn't control the output. The output, as you say, is always UTC.

I had UTC coming in, stored in @timestamp, and wanted to write to log files in the local timezone, which is what my solution is for.
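
In java.time terms the two directions look something like this (just a sketch; the date string and zone are made up for illustration):

require 'java'

tz  = java.time.ZoneId.of("Pacific/Auckland")
fmt = java.time.format.DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")

# What the date filter does: treat an incoming local wall-clock string as being in
# the given zone, and store the resulting UTC instant in @timestamp.
wall_clock = java.time.LocalDateTime.parse("2021-06-01 09:30:00", fmt)
utc_instant = wall_clock.atZone(tz).toInstant
puts utc_instant

# What local_date.rb does: take that UTC instant back to local time for display.
puts java.time.ZonedDateTime.ofInstant(utc_instant, tz)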
