Scripted field HourOfDay()

Hi,
I want a field containing the hour of the day for each elasticsearch document.
The easiest way would be to parse @timestamp into an hour_of_day field, but that would mean reindexing all my data.

I saw another solution by using Scripted Fields with:
doc["@timestamp"].getHourOfDay()

That works: it returns the hour ('HH') in the 00-23 range, which is exactly what I expected. But the hour_of_day value isn't correct, as it shows a different hour (probably due to timezone differences.. I don't know).
So I have:
@timestamp : April 30th 2017, 09:00:00.000
hourOfDay : 7
datetime : 2017-04-30 09:00:00 +02

The datetime field shows the deviation and it's different throughout my dataset. So in the example above, instead of getting 9 I'm getting 7.
Is this a Kibana timezone configuration step?

Thanks.

Timezones in Kibana are a little weird, especially with scripted fields.

All dates in Elasticsearch are UTC (+0). Kibana does some date formatting using your browser's local timezone, so you end up in situations exactly like the one you're describing.

You can override the timezone in Management > Advanced Settings, using the dateFormat:tz setting. Here, you can just select a timezone, whatever matches your data, and that should give you results more like you would expect.
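The mismatch is easy to reproduce outside Kibana. Here is a plain-Java sketch (the stored timestamp and the Europe/Paris zone are assumptions chosen to match the +02 offset in the example above): the stored instant never changes, only the zone it is rendered in does.

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class TzDemo {
    // Hour of the same instant as seen in a given timezone.
    static int hourIn(String isoInstant, String zone) {
        return ZonedDateTime.ofInstant(Instant.parse(isoInstant), ZoneId.of(zone)).getHour();
    }

    public static void main(String[] args) {
        // What Elasticsearch stores (always UTC):
        String stored = "2017-04-30T07:00:00Z";
        System.out.println(hourIn(stored, "UTC"));          // what the scripted field sees: 7
        System.out.println(hourIn(stored, "Europe/Paris")); // what Kibana displays (+02 in summer): 9
    }
}
```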

Thanks for the feedback, Joe!

Indeed scripted fields are a bit weird so I decided to go for the logstash config fix.
I've adjusted the dateFormat:tz to my timezone and the logstash config as:

date {
  locale => "en"
  timezone => "UTC"
  match => ["datetime",
            "YYYY-MM-dd HH:mm:ss Z",
            "YYYY-MM-dd HH:mm:ss",
            "ISO8601"]
}
mutate {
  add_field => { "hourOfDay" => "%{+HH}" }
}

I'm still having the "wrong" hour value from hourOfDay.

@timestamp: May 30th 2017, 02:00:00.000
hourOfDay: 00

The only way to get the correct hourOfDay is by removing the timezone (Z) value from my dataset, but that sounds wrong, as timezones should be transparent.
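A sketch of why stripping the offset changes the result (plain Java approximating the date filter's behavior, not the actual Logstash internals): when "+02" is present, the value is converted to UTC and %{+HH} then prints the UTC hour; without an offset, timezone => "UTC" tells Logstash the wall-clock value is already UTC, so the hour is kept as-is.

```java
import java.time.LocalDateTime;
import java.time.OffsetDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class OffsetDemo {
    // Input carries its own offset: convert to UTC, then take the hour.
    static int withOffset(String s) {
        return OffsetDateTime
                .parse(s, DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss X"))
                .withOffsetSameInstant(ZoneOffset.UTC)
                .getHour();
    }

    // No offset in the input: the wall-clock hour is taken at face value.
    static int assumedUtc(String s) {
        return LocalDateTime
                .parse(s, DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss"))
                .getHour();
    }

    public static void main(String[] args) {
        System.out.println(withOffset("2017-05-30 02:00:00 +02")); // 0 -> the "00" seen in hourOfDay
        System.out.println(assumedUtc("2017-05-30 02:00:00"));     // 2 -> the "expected" hour
    }
}
```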

Any suggestions?

So after testing a bit more, I got this to work by matching all the timezone variations in my dataset. Also, everything stays UTC (as life should be..) and the presentation layer handles the display.

date {
  timezone => "UTC"
  match => ["datetime",
            "YYYY-MM-dd HH:mm:ss +02",
            "YYYY-MM-dd HH:mm:ss +01"]
}

Scripted field: doc["@timestamp"].date.hourOfDay


I ran into the same confusion recently.

Modifying Logstash's date filter to change the timezone of the @timestamp sent to Elasticsearch is not the best approach, because the UTC @timestamp stored in Elasticsearch is already correct.

I tried the same method and eventually found out that Kibana transforms the @timestamp field into the local timezone for display, but when you use the scripted field doc["@timestamp"].date.hourOfDay, the result is still incorrect.

Let's see what happens. For every doc in Kibana's Discover panel, you'll find that the @timestamp field in the Table tab (June 8th 2017, 09:50:01.000) differs from the one in the JSON tab ("@timestamp": "2017-06-08T01:50:01.000Z"). That's because Kibana automatically transforms the UTC @timestamp (JSON tab) into the correct local timezone (Table tab). The UTC value is also correct (for machines), but its hour is not the one we want.

So the problem is NOT caused by the @timestamp field, it is caused by the method we used to get the hourOfDay.

Most of us have read the Lucene Expressions document referenced in Kibana. That official document tells us to use

doc["@timestamp"].date.hourOfDay

in a scripted field to get the hour of day, but it doesn't tell us that hourOfDay is calculated from the @timestamp in UTC (the one in the JSON tab). Worse, there is NO other method in Lucene Expressions to change the timezone!

After learning this, I looked for a solution using Painless. Finally, I found this blog post: Using painless kibana scripted fields. Following the article, I used the below expression in a scripted field with the Painless language to fix the problem:

LocalDateTime.ofInstant(Instant.ofEpochMilli(doc['@timestamp'].value), ZoneId.of('Asia/Shanghai')).getHour()

PS: No date filter needed.
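Since Painless exposes Java's java.time API, the expression above can be checked in plain Java. A minimal sketch, reusing the epoch-millis value and the Asia/Shanghai zone from the example earlier in this post (doc['@timestamp'].value is the stored UTC epoch-millis long):

```java
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneId;

public class PainlessDemo {
    // Same computation as the Painless scripted field:
    // hour of the stored UTC instant, viewed in the given zone.
    static int localHour(long epochMilli, String zone) {
        return LocalDateTime
                .ofInstant(Instant.ofEpochMilli(epochMilli), ZoneId.of(zone))
                .getHour();
    }

    public static void main(String[] args) {
        long millis = Instant.parse("2017-06-08T01:50:01Z").toEpochMilli();
        System.out.println(localHour(millis, "Asia/Shanghai")); // 9, matching the Table tab
    }
}
```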

Hope this helps.


When I started using Painless expressions the problem was solved, but now I always see an error message: request timeout after 30000ms. @kenxz, do you have the same problem? I've only changed the timezone.

No.
I've no idea whether scripted fields consume a lot of resources or not, but I had set elasticsearch.requestTimeout to 60000 in kibana.yml before I ran into this problem.
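For reference, the corresponding kibana.yml line (note the key is all lowercase and the value is in milliseconds):

```yaml
# kibana.yml -- raise the Elasticsearch request timeout from the 30000 ms default
elasticsearch.requestTimeout: 60000
```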

Perfect, thank you very much @kenxz!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.