Adjusting an ISO8601 @timestamp field

Hi all,

I'm a bit stuck and confused about how to use the Logstash date plugin for what I'm trying to do.

My situation is that I have incoming data (from a kafka input) populating the @timestamp field in ISO8601 format, but the time is actually local time, not UTC. So when I write this data to Elasticsearch, it sees the Z (I assume) and treats it as UTC, and Kibana then shows it 8 hours earlier than the actual local time (I'm in the America/Los_Angeles timezone).

What I'd like to do is simply add 8 hours to the @timestamp field to make it true UTC. Failing that, I'd create a new field that transforms the @timestamp value into proper UTC by adding 8 hours. But no matter what I try, I can never seem to change that value.

I've tried the following:

date {
  match => ["@timestamp", "ISO8601"]
  timezone => "America/Los_Angeles"
  target => "timestamp_utc"
}

date {
  match => ["@timestamp", "ISO8601"]
  timezone => "UTC"
  target => "timestamp_utc"
}

grok {
  match => { "@timestamp" => "%{TIMESTAMP_ISO8601:timestamp_utc}" }
}
date {
  match => ["timestamp_utc", "ISO8601"]
  timezone => "America/Los_Angeles"
  target => "timestamp_utc"
}

Somehow none of these seem to do anything. @timestamp never changes, and timestamp_utc is always exactly the same as @timestamp, which is local Seattle time.

"@timestamp": "2020-02-11T15:13:14.000Z",
"timestamp_utc": [
  "2020-02-11T15:13:14.000Z"
]

I feel like I'm missing something obvious, I know I've fixed stuff like this before, but this time it's eluding me. Any help would be greatly appreciated!

It may be that @timestamp is a LogStash::Timestamp rather than a string, and a date filter cannot parse that. I would start with

mutate { convert => { "@timestamp" => "string" } }

You may also need

mutate { gsub => [ "@timestamp", "Z$", "" ] }

since the presence of the Z will cause the timezone option to be ignored. Then you can use a date filter.
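Putting those two mutates together with a date filter, the whole thing might look something like this (a sketch, untested; the explicit Joda pattern assumes millisecond precision like your example):

filter {
  # Make @timestamp parseable as a string, then strip the trailing Z so
  # the date filter's timezone option is honored.
  mutate { convert => { "@timestamp" => "string" } }
  mutate { gsub => [ "@timestamp", "Z$", "" ] }

  # Reinterpret the local-time string as America/Los_Angeles and write
  # the UTC result back to @timestamp.
  date {
    match    => ["@timestamp", "yyyy-MM-dd'T'HH:mm:ss.SSS", "ISO8601"]
    timezone => "America/Los_Angeles"
  }
}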

Thanks for the reply! These suggestions helped me get most of the way there.

This was the block that ended up letting me finally at least adjust the @timestamp value into a new field called timestamp_modified:

mutate { copy => { "@timestamp" => "timestamp_modified" } }
mutate { convert => { "timestamp_modified" => "string" } }
mutate { gsub => [ "timestamp_modified", "Z", "" ] }
date {
  match => ["timestamp_modified", "yyyy-MM-dd'T'HH:mm:ss'.'SSS"]
  timezone => "UTC"
}
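One note on that block: without a target option, the date filter writes its result to @timestamp, not to timestamp_modified (which stays a plain string after the gsub). If the goal is to leave @timestamp untouched and put the parsed value in timestamp_modified, something like this should do it (a sketch):

date {
  match    => ["timestamp_modified", "yyyy-MM-dd'T'HH:mm:ss'.'SSS"]
  timezone => "UTC"
  target   => "timestamp_modified"
}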

The last problem I'm having is actually getting Kibana to show the data in my local time. I've tried setting the timezone option to a bunch of different values (UTC, America/Los_Angeles, +0800), and each one changes the timestamp_modified value, but no matter what, when Kibana renders the data using timestamp_modified as the time field in the index pattern, it's always 8 hours behind.

If the value of @timestamp is my true local time (America/Los_Angeles), what timezone should I set for the timestamp_modified field in order to render the data properly? Again, I feel I'm so close, but overlooking something obvious, heh.

Thanks a bunch for all the help.

Elasticsearch always stores dates in UTC. The timezone option on the date filter is used to tell it what timezone the logs were written in. Kibana will, by default, shift the UTC timestamps from Elasticsearch to the browser's local timezone, although this can be disabled.
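Concretely, the timezone option should name the zone the timestamps were written in, and Kibana's browser-local shift then cancels it out. For example (hypothetical values; America/Los_Angeles is UTC-8 in February):

# A local string "2020-02-11T15:13:14.000" (PST) parsed with
# timezone => "America/Los_Angeles" becomes
# "2020-02-11T23:13:14.000Z" in Elasticsearch, which Kibana
# renders back as 15:13 in a Los Angeles browser.
date {
  match    => ["timestamp_modified", "yyyy-MM-dd'T'HH:mm:ss.SSS"]
  timezone => "America/Los_Angeles"
}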
