How to set @timestamp timezone?

Hi,

I found that @timestamp always shows UTC time. My date filter is:
date {
  match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  timezone => "Asia/Shanghai"
}

After processing, the timestamps are:
"@timestamp" => "2015-08-27T03:07:56.000Z",
"timestamp" => "27/Aug/2015:11:07:56 +0800",

The timezones differ. I searched the web and tried a ruby filter with code => "event['@timestamp'] = event['@timestamp'].localtime('+08:00')", but the localtime method is not supported there.

Questions:

  1. How can I change @timestamp to match my timestamp?
  2. Can I set the timezone in Kibana?

Thanks!

  1. How can I change @timestamp to match my timestamp?

Please don't. Store timestamps in UTC and leave timezone adjustments to the presentation layer. You are spending time solving a problem that doesn't exist.

  2. Can I set the timezone in Kibana?

I believe Kibana always adjusts the UTC time to the browser's timezone. In Kibana 3 this is optional, but in Kibana 4 I don't think you can turn it off.


Thanks for the reply!

But in Kibana, how can I calculate values (e.g. a daily count) using my local timezone? Kibana aggregates on @timestamp, which is in UTC.

It uses UTC, but if you ask for $today it will only show the 24 hours from $now-24hr to $now relative to your TZ, not an absolute UTC day.

Hi,

I have a couple more questions about timestamps:

  1. Logstash generates daily indexes based on @timestamp, which is in UTC. If I want to get
     documents from 2015-09-01 to 2015-09-02 in my timezone, I need to search the indexes
     logstash-2015.08.31 and logstash-2015.09.01. If I could change @timestamp to my timezone
     directly, I could search just the index logstash-2015.09.01. Correct?
  2. I use the filter:
     grok {
       match => { "message" => "%{COMBINEDAPACHELOG}" }
     }
     Can I parse the timestamp to add the fields "year", "month", and "day"?
     e.g. 20/Aug/2015:07:06:25 +0800
     to: "year" => "2015", "month" => "2015-08", "day" => "2015-08-20"

Thanks in advance!

  1. Yes.
  2. Sure, you can have an additional grok filter that extracts those fields from the timestamp produced by the COMBINEDAPACHELOG pattern. But why would you want to do that? A range query against the @timestamp field is simple and fast; see the sketch below.
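
For illustration, a minimal sketch of such a range query (the bounds and the +08:00 offset are assumptions based on your example timestamps; Elasticsearch converts the bounds to UTC for you):

{
  "query": {
    "range": {
      "@timestamp": {
        "gte": "2015-08-20T00:00:00+08:00",
        "lt": "2015-08-21T00:00:00+08:00"
      }
    }
  }
}
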
  1. How can I change the @timestamp? From your first reply it sounds like I can't change it...
  2. I want to calculate counts for some fields daily, weekly, and monthly. Currently, to calculate counts for 10 days I have to run a range query 10 times; I think aggregating on a day field would be faster.
  1. Correct, it's not configurable, and as we've explained you really shouldn't touch it either. However, you can trick Logstash by setting the date filter's timezone to UTC, thereby disabling the timezone adjustment when parsing the date.
  2. Elasticsearch can do that for you: just use a date histogram aggregation (see the sketch below). But I guess Kibana currently isn't capable of passing the timezone parameter.
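
For reference, a sketch of such an aggregation (assuming an Elasticsearch version whose date_histogram supports the time_zone option; the aggregation name "per_day" is arbitrary):

{
  "aggs": {
    "per_day": {
      "date_histogram": {
        "field": "@timestamp",
        "interval": "day",
        "time_zone": "+08:00"
      }
    }
  }
}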

I can't trick Logstash by setting the timezone to "UTC". My message's timestamp is "[20/Aug/2015:21:06:25 +0800]" and my date filter is

date {
  match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  timezone => "UTC"
}

What's wrong with it?

You probably need to omit the "Z" token from the pattern: when the pattern itself parses an offset from the string, that offset wins and the timezone option is ignored. Try this:

date {
  match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss +0800" ]
  timezone => "UTC"
}

If that works, consider removing "+0800" from the pattern and from the timestamp field.


"+0800" works fine, just remove it from the date pattern will cause error, I think remove it from timestamp field needs change the COMBINEDAPACHELOG pattern, it's a little bit complex.

Thanks a lot!

You can change your initial grok expression, or you can delete the timezone token from the resulting timestamp field using the mutate filter's gsub option.
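
For example, a minimal sketch of the gsub approach (assuming the offset is always +0800, as in your logs; place it before the date filter):

mutate {
  # strip the trailing " +0800" so the date filter's timezone option takes effect
  gsub => [ "timestamp", " \+0800$", "" ]
}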

I use the "COMBINEDAPACHELOG" pattern, in the grok-patterns file, I see "COMBINEDAPACHELOG" uses "COMMONAPACHELOG" which include "HTTPDATE",
so I want to get "year" as a new field, my grok filter is:
grok {
match => { "message" => "%{COMBINEDAPACHELOG}"}
add_field => {"year" => "%{YEAR}"}
}

but %{YEAR} doesn't work, how can I get the "year" field?

By default grok only creates fields for named captures, so the %{YEAR} inside HTTPDATE doesn't become a field that add_field can reference. I suggest you add a second grok filter to extract the year from the newly extracted timestamp field:

grok {
  match => ["timestamp", "^%{MONTHDAY}/%{MONTH}/%{YEAR:year}"]
}
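
If you also want the "month" and "day" fields in the format you described, one option (a sketch, not tested against your config) is Logstash's sprintf date formatting, which formats the event's @timestamp and so must run after the date filter. Note that sprintf formats in UTC; with your timezone => "UTC" trick, that is your local wall-clock time:

mutate {
  add_field => {
    "year"  => "%{+YYYY}"
    "month" => "%{+YYYY-MM}"
    "day"   => "%{+YYYY-MM-dd}"
  }
}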

It works. Thank you very much for your patience!


I'm a newbie to the ELK stack and ran into this issue as well.

From a sysadmin/user point of view, I'd call this a workaround; everything would work perfectly if Logstash simply added support for customizing the timezone.

Why not add this simple feature instead of prescribing such a complex workaround?


Hi magnusbaeck,

If I just want to change the @timestamp field, what should I do? The following is my log format:
2016-01-15 09:33:23,650 INFO [127.0.0.1:57799 http-bio-8080-exec-10] com.ins.car.controller.CarOrderController.getInsResult(CarOrderController.java:62) - 2016-01-15 17:33:23,458 getInsResultServlet {....}

As you can see, there are two times, 2016-01-15 09:33:23,650 and 2016-01-15 17:33:23,458. The first one matches %{+yyyy-MM-dd HH:mm:ss,SSS}; the second one comes from the source message.

@jarvan4dev, please start a new thread for your question.

For reference, Kibana 4.5 supports changing this under Settings -> Advanced (I believe via the dateFormat:tz setting).

Hello @magnusbaeck,
Thanks so much for your insights on handling the timestamp conversion in Logstash. I tried your suggestion of tricking Logstash with the date filter set to UTC, but it does not seem to work. My need is to have Logstash write the output from DB2 to a flat file without changing the original timezone of the data, so I want to avoid Logstash (2.3.4) converting my date fields to UTC.

The format of my raw data from the DB for a field, say RECENT_TS, is 2016-09-12 14:08:13.355243, and it is in the MST timezone.

filter {
  date {
    match => [ "RECENT_TS", "ISO8601" ]
    locale => "en"
    timezone => "UTC"
  }
}
When I used this date filter and executed it, I got the following error:
Failed parsing date from field {:field=>"RECENT_TS", :value=>"2016-09-12T17:43:12.148Z", :exception=>"cannot convert instance of class org.jruby.RubyObject to class java.lang.String", :config_parsers=>"ISO8601", :config_locale=>"en", :level=>:warn}

So I included a mutate filter to convert the field to a string before the date filter, like below:

filter {
  mutate {
    convert => [ "RECENT_TS", "string" ]
  }
  date {
    match => [ "RECENT_TS", "ISO8601" ]
    locale => "en"
    timezone => "UTC"
  }
}

Now I don't get any errors, but my original problem is still not solved:

Logstash output: "RECENT_TS":"2016-09-12T22:00:36.403Z"
DB input for the RECENT_TS field: 2016-09-12 15:00:36.403977

I tried giving different canonical ID values in the date filter's timezone parameter, but they don't seem to be reflected in the Logstash output. Am I missing something here? Your help is appreciated!