Unable to convert date to specified timezone

Hi All,

I am trying to convert a UTC date to a different timezone (Asia/Tokyo) in Logstash. Below is the configuration I have tried:

input {
  stdin {
    id => "logstash-rae"
    add_field => {
        "log_origin_rae" => "live_logs_rae"
    }
  }
}

filter {

  mutate { add_field => { "[my_date_utc]" => "2023-01-03 16:00:00.034" } }
  date {
    match => ["my_date_utc", "yyyy-MM-dd HH:mm:ss.SSS"]
    timezone => "Asia/Tokyo"
    target => "my_date_jst"
  }
  date {
    match => ["my_date_utc", "yyyy-MM-dd HH:mm:ss.SSS"]
    timezone => "Europe/Paris"
    target => "my_date_europe"
  }
}
output {
  stdout {
    codec => rubydebug {
      metadata => true
    }
  }
}

Below is the output in the Logstash console:

{
        "@timestamp" => 2023-01-03T16:42:25.064Z,
       "my_date_jst" => 2023-01-03T07:00:00.034Z,
    "log_origin_rae" => "live_logs_rae",
              "host" => "logstash",
          "@version" => "1",
       "my_date_utc" => "2023-01-03 16:00:00.034",
    "my_date_europe" => 2023-01-03T15:00:00.034Z,
           "message" => ""
}

My server timezone is UTC, and I expected the output below:

{
        "@timestamp" => 2023-01-03T16:42:25.064Z,
       "my_date_jst" => 2023-01-04T01:00:00.034Z,
    "log_origin_rae" => "live_logs_rae",
              "host" => "logstash",
          "@version" => "1",
       "my_date_utc" => "2023-01-03 16:00:00.034",
    "my_date_europe" => 2023-01-03T17:00:00.034Z,
           "message" => ""
}

Can you please help me understand why the date values are being shifted backwards based on the timezone?
Is this the behaviour of the Logstash date filter?

Please correct me if I am doing anything wrong.

Thanks,
Ganeshbabu R

[my_date_utc] is a string (it has double quotes around it), [my_date_jst] is a LogStash::Timestamp object (it does not have quotes around it), so it is always in UTC (that's what the trailing Z tells you). The timezone option on the date filter tells it that [my_date_utc] is actually in Asia/Tokyo timezone, so it knows to move it back by 9 hours to get to UTC.

Basically the names of your fields are the wrong way around.
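As a minimal sketch (the field names here are hypothetical), a correctly labelled version of that filter would declare the timezone of the source string and name the target for what it actually holds, UTC:

date {
  match    => ["my_date_tokyo_local", "yyyy-MM-dd HH:mm:ss.SSS"]
  timezone => "Asia/Tokyo"   # timezone of the SOURCE string
  target   => "my_date_utc"  # the parsed result is always UTC
}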

Thanks for the clarification @Badger

I am facing an issue with @timestamp: I expected the @timestamp value to be my local time, but instead I am getting a different value.

Below is the Logstash configuration I am using to send messages to Kafka:

input {
 file {
     add_field => { "[@metadata][type]" => "response_log" }
     path => "/usr/local/tomcat/logs/test.log"
     start_position => "beginning"
 }
}
filter {
 if [@metadata][type] == "response_log" {
        csv {
          columns => ["datetime","request_ip","domain","host","method","uri","response_code","response_time","client_id","payload","headers","exception_type","exception_message","api_response_code","api_response_time","user_id","api_type","extra_field"]
          separator => "        "
          quote_char => "       "
          convert => {
            "response_time" => "integer"
            "api_response_time" => "integer"
            "api_type" => "integer"
          }
        }
        ruby {
          init => 'require "date"'
          # processing_time = response_time minus api_response_time;
          # ts = epoch seconds of the JST datetime string, parsed with an explicit +09:00 offset
          code => 'event.set("processing_time", event.get("response_time").to_i - event.get("api_response_time").to_i)
                   event.set("ts", DateTime.strptime(event.get("datetime") + "+09:00", "%F %T.%L%z").strftime("%s"))'
        }
        mutate {
         convert => {
          "ts" => "integer"
          "processing_time" => "integer"
         }
         remove_field => [
           "path",
           "extra_field"
         ]
      }
   }
}
output {
 <message sent to kafka topic>
}

Sample log line:

2023-01-05 15:27:39.868 601.86.226.128 app.test.co stg-7dcbcb5965-drm GET /api/Travel/VacantRoomCalendar/20140716 401 0 API_REQUEST_ID=efbc5f2c7-4a6b-b66b-f68bc21bab63 x-forwarded-for=10.80.23.72 OAuthError specify valid access token 0 0 0 1

Once Logstash sends the message to Kafka, another Logstash pipeline reads from the Kafka topic and pushes the events to Elasticsearch. In that configuration, I am converting the datetime field to UTC and setting it in the target field @datetime:

input {
 <messages are read from kafka topic>
}
filter {
    if [@metadata][kafka][topic] == "test" {
    date {
      timezone => "UTC"
      target => "@datetime"
      match => [ "datetime", "ISO8601" ]
    }
 }
}
output {
  <message index to elasticsearch>
}

But when I checked the index in Elasticsearch, I found that the @timestamp value is 9 hours behind the @datetime field value, stored like this:

"@timestamp": "2023-01-05T06:27:39.000Z"

and below is the @datetime value stored in the same index:

"@datetime": "2023-01-05T15:27:39.868Z"

JSON stored in the Elasticsearch index:

  "_source": {
    "method": "GET",
    "ts": 1672900059,
    "exception_type": "OAuthError",
    "api_type": 1,
    "request_ip": "601.86.226.128",
    "@datetime": "2023-01-05T15:27:39.868Z",
    "uri": "/api/Travel/VacantRoomCalendar/20140716",
    "api_response_code": "0",
    "@timestamp": "2023-01-05T06:27:39.000Z",
    "processing_time": 0,
    "payload": "API_REQUEST_ID=efbc5f2c--f68bc21bab63",
    "user_id": "0",
    "host": "stg-7dcbcb5965-drm4k",
    "datetime": "2023-01-05 15:27:39.868",
    "domain": "app.test.com",
    "api_response_time": 0,
    "@version": "1",
    "response_code": "401",
    "response_time": 0,
    "client_id": null,
    "exception_message": "specify valid access token",
    "headers": "x-forwarded-for=10.80.23.72"
  }

Can you please help me understand why the @timestamp value is being indexed like this? I expected the @timestamp value to be in my local timezone (I mean the current time on the server).

At the same time, if I remove this line from the Ruby code and reprocess, the @timestamp value is stored as the current time:

event.set("ts", DateTime.strptime(event.get("datetime") + "+09:00","%F %T.%L%z").strftime("%s"))

Kindly share your thoughts.

Please correct me if I am doing anything wrong.

Thanks,
Ganeshbabu R

Not going to happen. [@timestamp] is always UTC.

Thanks @Badger

Yes, I also thought the @timestamp value would be stored as UTC because my Logstash server time is in UTC, but in the filter conditions I haven't set any target field for @timestamp.

But I am not sure how Logstash subtracts 9 hours from the @datetime value for the @timestamp field.

At the same time, if I remove this line from the Ruby code and reprocess, the @timestamp value is stored as the current UTC time on the server, and 9 hours are not subtracted from the @datetime field:

event.set("ts", DateTime.strptime(event.get("datetime") + "+09:00","%F %T.%L%z").strftime("%s"))

Kindly correct me if my understanding is wrong.

Thanks
Ganeshbabu R

In which timezone is the value of your original date string in the datetime field?

The date filter will convert a date string into a Logstash::Timestamp object, which is always in UTC. In the filter below you are telling Logstash that the datetime field is already in UTC; is this correct?

    date {
      timezone => "UTC"
      target => "@datetime"
      match => [ "datetime", "ISO8601" ]
    }

The timezone option in the date filter lets you tell the date filter that your date string is in a timezone other than UTC, so that it can add or subtract the correct offset. It is the timezone of the source date string, not the timezone you want to convert to: the output of the date filter is always in UTC.
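To illustrate with the values from the first post: the string 2023-01-03 16:00:00.034, declared to be in Asia/Tokyo (UTC+9), is moved back nine hours when converted to UTC, which is exactly the result seen there:

date {
  match    => ["my_date_utc", "yyyy-MM-dd HH:mm:ss.SSS"]
  timezone => "Asia/Tokyo"
  target   => "my_date_jst"   # "2023-01-03 16:00:00.034" => 2023-01-03T07:00:00.034Z
}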

Thanks @leandrojmp

The value of the datetime field is in the Japan timezone (JST).

Yes, right, I am converting a date string to a Logstash timestamp object in the date filter.

Is this the reason the @timestamp value is 9 hours behind?

Kindly share your thoughts.

Thanks,
Ganeshbabu R

If the value of the datetime field is in JST, but the date string does not have any information about the timezone, then your first date filter is wrong.

It needs to be:

date {
    match => ["datetime", "ISO8601"
    timezone => "Asia/Tokyo"
    target => "@datetime"
}

This will parse your datetime date string, apply the Asia/Tokyo offset, and store the resulting UTC value in the @datetime field. For example, 2023-01-05 15:27:39.868 in Asia/Tokyo becomes 2023-01-05T06:27:39.868Z in UTC.

I would say that this is probably the cause, you were using the wrong timezone in your first date filter.

Thanks @leandrojmp

When I changed the date filter timezone from UTC to Asia/Tokyo, I understood that the @datetime field value would be UTC (9 hours behind the datetime value) and that the @timestamp value would also be stored in UTC.

date {
    match => ["datetime", "ISO8601"]
    timezone => "Asia/Tokyo"
    target => "@datetime"
}

JSON stored in Elasticsearch:

  "_source": {
    "method": "GET",
    "exception_type": "OAuthError",
    "ts": 1673014299,
    "request_ip": "601.86.226.",
    "api_type": 1,
    "@datetime": "2023-01-06T14:11:39.868Z",
    "uri": "/api/Travel/VacantRoomCalendar/20140716",
    "api_response_code": "0",
    "@timestamp": "2023-01-06T14:11:39.000Z",
    "processing_time": 0,
    "payload": "API_REQUEST_ID=efbc5f2c-bba7-4a6b-f68bc21bab63",
    "user_id": "0",
    "host": "prestg-7dcbcb5965-drm4k",
    "datetime": "2023-01-06 23:11:39.868",
    "domain": "stg-jp.app.co.jp",
    "api_response_time": 0,
    "response_code": "401",
    "@version": "1",
    "response_time": 0,
    "client_id": null,
    "exception_message": "specify valid access token",
    "headers": "x-forwarded-for=10.80.94.78"
  }

After changing the date filter to timezone "Asia/Tokyo", I expected a change only in the @datetime field, but the @timestamp value is also storing the same value:

"@datetime": "2023-01-06T14:11:39.868Z",

"@timestamp": "2023-01-06T14:11:39.000Z",

Can you please share your thoughts on this?

Is the @timestamp value correct? Is this how the @timestamp value is stored when using the date filter with a timezone? The reason I am asking is that when I am processing the message, my server's UTC time is a little ahead of the @timestamp above. I thought my server's UTC time would be stored as @timestamp.


I also tested by commenting out this line in the filter section:

event.set("ts", DateTime.strptime(event.get("datetime") + "+09:00","%F %T.%L%z").strftime("%s"))'

When I process the message, the @timestamp value is stored as the current time from my server:

"@timestamp": "2023-01-07T04:47:05.059Z",

"@datetime": "2023-01-07T04:46:00.868Z",

Below is the JSON stored in Elasticsearch:

  "_source": {
    "method": "GET",
    "exception_type": "OAuthError",
    "api_type": 1,
    "request_ip": "891.86.226.128",
    "@datetime": "2023-01-07T04:46:00.868Z",
    "uri": "/api/Travel/VacantRoomCalendar/20140716",
    "api_response_code": "0",
    "@timestamp": "2023-01-07T04:47:05.059Z",
    "processing_time": 0,
    "payload": "API_REQUEST_ID=efbc5f2c-bba7--f68bc21bab63",
    "user_id": "0",
    "host": "prestg-7dcbcb5965-drm4k",
    "datetime": "2023-01-07 13:46:00.868",
    "domain": "stg-jp.app.co.jp",
    "api_response_time": 0,
    "@version": "1",
    "response_code": "401",
    "response_time": 0,
    "client_id": null,
    "exception_message": "specify valid access token",
    "headers": "x-forwarded-for=10.80.94.78"
  }

I am unable to understand how the DateTime.strptime call affects the @timestamp value in the first JSON, where the @datetime and @timestamp values are the same.

Kindly let me know your thoughts; it would be helpful.

Thanks,
Ganeshbabu R

I'm a little lost now on what your issue is and what you want to achieve. Can you provide more context about the issue and the result you expect?

The filter below will convert your datetime field, which is in the Asia/Tokyo timezone, to UTC.

date {
    match => ["datetime", "ISO8601"]
    timezone => "Asia/Tokyo"
    target => "@datetime"
}

This filter only affects the @datetime field.

There is no relation between the values of @datetime and @timestamp. The @datetime field is created by the date filter parsing your original datetime field; the @timestamp field is created by Logstash when the event enters the pipeline. If you are consuming data in real time, those two fields may have similar times.

Normally people want the @timestamp field to hold the original time of the event, and use a date filter with the default target to store this value in the @timestamp field.

Is this what you want? If so, you also need the following date filter:

date {
    match => ["datetime", "ISO8601"]
    timezone => "Asia/Tokyo"
}

If you do not have a date filter with the default target, the @timestamp field is created by Logstash itself.
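Putting the two together, a sketch of a filter block that sets @timestamp from the event time and also keeps a parsed copy in @datetime (assuming the datetime field is in Asia/Tokyo, as above) would be:

date {
  # default target: store the UTC result in @timestamp
  match => ["datetime", "ISO8601"]
  timezone => "Asia/Tokyo"
}
date {
  # same parse, stored in a separate @datetime field
  match => ["datetime", "ISO8601"]
  timezone => "Asia/Tokyo"
  target => "@datetime"
}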

@leandrojmp

Yes, you are right, there is no relation between the @datetime and @timestamp fields.

There was a filter issue in my Logstash configuration that was causing the problem. We have figured it out and fixed it.

Sorry for the confusion, and thanks for your explanation!

Thanks,
Ganeshbabu R
