UTC+2 being added to the Time field in Kibana

I have a timestamp field in my log lines, as follows:

2017-04-03 07:21:39,092

In my grok pattern I extract the field "Data" (using the TIMESTAMP_ISO8601 pattern), as follows:

grok {
  match => { "message" => "%{TIMESTAMP_ISO8601:Data}" }
}
mutate {
  gsub => [
    "Data", " ", "T"
  ]
}

When I created the index pattern in Kibana I mapped the field "data" to the field "time".
In Kibana I have the following output:

"time" : April 3rd 2017, 09:21:39.092
"data" : April 3rd 2017, 09:21:39.092
"@timestamp" : April 18th 2017, 14:41:32.042
"message": 2017-04-03 07:21:39,092

My operating system's configured time zone is UTC+2.
I configured Kibana's dateFormat:tz setting to Africa/Maputo (my current timezone), but the problem persists.

So I changed dateFormat:tz to Etc/UTC, which solved the problem, but that doesn't seem like the best way to configure it.
Is there any way to keep dateFormat:tz set to the correct timezone?

What's the timezone for the original timestamp in the message field? Right now it includes no timezone information, so Elasticsearch will assume it is in UTC. When Kibana gets the date in UTC, it adds two hours to account for your local timezone. If you want message, data, and time to all match, you'll need to add the timezone to the message field before indexing your documents into Elasticsearch.

How can I add the timezone?
My grok is exactly as I showed above.

Personally I would try to fix it in the application that's generating the logs, so that the time is always correct no matter how you end up using the logs. However, you could also use the Logstash date filter to parse the date and apply the appropriate timezone.
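
For example, something like this might work (just a sketch, assuming your grok captures the timestamp into a field called Data as shown above; by default the date filter writes the parsed value to @timestamp):

date {
  # the timestamp string carries no offset, so tell the filter which zone it is in
  match    => [ "Data", "ISO8601" ]
  timezone => "Africa/Maputo"
}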

Thank you for your support.

Hey Bargs!
Sorry for the late feedback. I've been trying to solve this problem, but still no luck.

Here is what I have done:

grok {
  match => { "message" => "%{TIMESTAMP_ISO8601:Data}" }
}
mutate {
  gsub => [
    "Data", " ", "T"
  ]
}
date {
  match => [ "Data", "ISO8601" ]
  timezone => "Africa/Maputo"
}

And in my Kibana advanced settings I have dateFormat:tz set to the browser timezone.

But Elasticsearch is still indexing the "Data" field two hours ahead.

The dates in my logs have the correct date and time (Africa/Maputo timezone).
I can't understand what's going on here.

Let's ensure Elasticsearch has the correct datetime, then we'll look at Kibana.

Search your index pattern and pull back the time field you're interested in as a docvalue_field. It'll look something like this:

GET <index-pattern>/_search 
{
  "query": {
    "match_all": {}
  },
  "docvalue_fields": ["time"]
}

For each hit in the response you'll see a fields key with the time field and its value. Compare this value to the value you see in _source in the same document.

What does the _source value look like, and what does the doc_value look like? If the timezone was set correctly, the doc_value should be 2 hours behind the _source value because the doc_value is in UTC whereas the _source value is in Africa/Maputo.
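
If the documents are large, you can optionally trim the response down to just the values you're comparing, e.g. (using the Data field from your grok; swap in whatever field you actually need):

GET <index-pattern>/_search
{
  "size": 1,
  "_source": ["message", "@timestamp", "Data"],
  "docvalue_fields": ["Data"],
  "query": {
    "match_all": {}
  }
}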

If this looks correct, we can move on to Kibana and see what's happening there.

Here is the output (I had to remove some data).

"_source" : {
          "@version" : "1",
          "beat" : {
            "hostname" : "",
            "name" : "",
            "version" : "5.2.2"
          },
          "host" : "",
          "ResponseTime" : 16,
          "Gecko" : "Gecko/20100101",
          "geoip" : {
            "latitude" : -23.865,
            "location" : [
              35.3833,
              -23.865
            ],
            "longitude" : 35.3833
          },
          "offset" : 563,
          "input_type" : "log",
          "Data" : "2017-04-26T11:51:39,108",
          "message" : "2017-04-26 11:51:39,108 [data removed]",
          "tags" : [
            "beats_input_codec_plain_applied"
          ],
          "@timestamp" : "2017-04-26T09:51:39.108Z",
          "translation" : "{\"geoip\": {\"latitude\": -23.865, \"longitude\": 35.3833, \"location\": [35.3833, -23.865]}}",
          "BrowserVersion" : "Firefox/23.0",
          "Application" : "01",
          "BrowserProvider" : "Mozilla/5.0"
        }
  },
"fields" : {
  "Data" : [
1493207499108
  ]
}

That's the output using the configuration with the date plugin, as follows:

grok {
  match => { "message" => "%{TIMESTAMP_ISO8601:Data}" }
}
mutate {
  gsub => [
    "Data", " ", "T"
  ]
}
date {
  match => [ "Data", "ISO8601" ]
  timezone => "Africa/Maputo"
  target => "@timestamp"
}

I tried to change the timestamp field with target => "@timestamp".

Ok, that mostly looks correct to me. Data is in the Africa/Maputo timezone, which is UTC+02:00. @timestamp is in UTC, so it is two hours behind Data. The only remaining problem is that the raw Data field still doesn't specify its timezone, so Elasticsearch is going to assume it's UTC. The doc value above confirms this: 1493207499108 ms is 2017-04-26T11:51:39.108 UTC, i.e. the local wall-clock time stored as if it were already UTC.

I would just add an additional date filter to your Logstash config to convert the Data field itself, e.g.

date {
  match => [ "Data", "ISO8601" ]
  timezone => "Africa/Maputo"
  target => "Data"
}

I think that'll work.
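
For reference, the whole filter section would then look something like this (just a sketch combining the snippets above, with the same field names and timezone):

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:Data}" }
  }
  mutate {
    # turn "2017-04-26 11:51:39,108" into the ISO8601-style "2017-04-26T11:51:39,108"
    gsub => [ "Data", " ", "T" ]
  }
  # parse the local time and store it in @timestamp (as UTC)
  date {
    match    => [ "Data", "ISO8601" ]
    timezone => "Africa/Maputo"
    target   => "@timestamp"
  }
  # re-parse the same value into Data itself so it is indexed with the correct timezone
  date {
    match    => [ "Data", "ISO8601" ]
    timezone => "Africa/Maputo"
    target   => "Data"
  }
}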

Thank you, Bargs.

Working as expected now.
