Kibana incorrect time shown

Hello,

I'm having a long fight with Kibana/Logstash (7.2.0 and 7.0.0) over an "issue" that other people have hit too.

After reading other topics/discussions I've made some corrections to my Logstash configuration file. As you can see below, I parse the timestamp field as ISO 8601 (*1) and I set the timezone to "Europe/Lisbon". I set this option because I read in a discussion that it can handle Daylight Saving Time. At the time of writing, Portugal is UTC+1.

(*1) My Log4j2 configuration file has the following: KeyValuePair key="timestamp" value="$${date:yyyy-MM-dd'T'HH:mm:ss.SSS'Z'}"
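As a quick sketch of what this pattern produces (a Python stand-in, not my actual Log4j2 code, using an assumed Lisbon summer offset of UTC+1 and a made-up wall-clock time):

```python
from datetime import datetime, timezone, timedelta

# Hypothetical local wall-clock time in Lisbon during DST (UTC+1)
local = datetime(2019, 7, 15, 12, 0, 0)

# What a pattern like yyyy-MM-dd'T'HH:mm:ss.SSS'Z' emits:
# the local time with a literal 'Z' appended, which claims UTC
mislabeled = local.strftime("%Y-%m-%dT%H:%M:%S.000Z")

# The instant that wall-clock time actually corresponds to, in UTC
actual_utc = local.replace(tzinfo=timezone(timedelta(hours=1))).astimezone(timezone.utc)

print(mislabeled)              # 2019-07-15T12:00:00.000Z
print(actual_utc.isoformat())  # 2019-07-15T11:00:00+00:00
```

The two printed values differ by exactly one hour: the literal 'Z' stamps a non-UTC time as UTC.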

input {
  beats {
    port => "5044"
  }
}

filter {
  json {
    source => "message"
  }
  json {
    source => "message"
    skip_on_invalid_json => true
  }
  date {
    match => ["timestamp", "ISO8601"]
    timezone => "Europe/Lisbon"
  }
}

output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "my-custom-index"
  }
}

Unfortunately, my logs always show one hour ahead on the timeline.
Changing Management > Advanced Settings > dateFormat:tz between "Portugal", "Europe/Lisbon", "Browser" and "UTC" makes no difference.

I also changed my laptop's date & time settings to disable Daylight Saving Time, with no effect in Kibana.

I can see in Kibana that the fetched data (log > Expanded document > JSON) is correct, so I'm assuming this is a Kibana thing.

Am I missing something?

The Kibana timeline is controlled by the time field chosen when you define the index pattern in Kibana, usually @timestamp. In the expanded document, compare the time in "message" with @timestamp to verify the date filter is working.

If not set, @timestamp defaults to ingest time.

I don't understand why you are calling the json filter twice with source => "message". I think the second call just overwrites what the first has done, but I don't think it's related to the time problem.

Hello Len,

Thank you for the reply!

I'm overriding @timestamp, and I got that working long ago. The timestamp in Kibana matches the one in my application console (down to the millisecond), and the ordering of the logs is correct.

I'm calling the JSON filter plugin twice because there is a nested message field inside the first one, so Logstash will "chain" the filters and index all the JSON items into the event document. This let me avoid additional adapters in Log4j2 and adding data to the MDC. It's awesome!
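To show what I mean by "chaining" (a rough Python sketch of the two filter passes, with a made-up event; the real work is done by Logstash's json filter):

```python
import json

# Hypothetical Filebeat event: the outer "message" is JSON, and its own
# "message" field is itself a JSON string (the nested layout)
event = {"message": '{"level": "INFO", "message": "{\\"orderId\\": 42}"}'}

# First pass: parse the outer JSON and merge its keys into the event
event.update(json.loads(event["message"]))

# Second pass: "message" now holds the inner JSON string, so parse it too;
# the try/except mimics skip_on_invalid_json => true
try:
    event.update(json.loads(event["message"]))
except json.JSONDecodeError:
    pass

print(event["orderId"])  # 42
```

After both passes, the fields of the inner JSON document are top-level fields of the event, which is what gets indexed.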

I also don't think it's related to the problem; as I explained above, I see the correct timestamps.

Hi,

I finally managed to get this working!

The problem is that I was sending the wrong datetime data to ELK.

As I said in my first post, my application's format is the following:

yyyy-MM-dd'T'HH:mm:ss.SSS'Z'

But this is wrong: that pattern prints local time with a literal 'Z' appended, which labels it as UTC (ISO 8601) when it isn't.
From my application I need to send my timezone (or offset) information.

So what i did was:

  1. Remove the datetime field from my JSONLayout (Log4j2)
  2. Use the UNIX timestamp field that Log4j2 already attaches, called timeMillis
  3. In my Logstash configuration file, change the date filter to the following:

date {
  match => ["timeMillis", "UNIX_MS"]
  timezone => "Europe/Lisbon"
}

A UNIX timestamp in milliseconds is already an absolute instant, so Logstash (or ES) stores it directly as UTC; no offset guessing is involved (for UNIX_MS the timezone option has no effect). In Kibana the event is correct now, leaving dateFormat:tz at its default.

Events sent from machines in other timezones should also come out correct, since epoch milliseconds are timezone-independent.
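A small Python sketch of why this works (the timeMillis value is made up; it stands in for whatever Log4j2 attaches):

```python
from datetime import datetime, timezone

# Hypothetical timeMillis value as Log4j2 would attach it:
# epoch milliseconds, an absolute instant with no timezone attached
time_millis = 1563188400000

# What Logstash's date filter with UNIX_MS effectively does:
# epoch milliseconds map to one unambiguous UTC instant,
# no matter which machine or timezone produced them
ts = datetime.fromtimestamp(time_millis / 1000, tz=timezone.utc)
print(ts.isoformat())  # 2019-07-15T11:00:00+00:00
```

The same time_millis value yields the same UTC instant everywhere, which is why no per-machine timezone handling is needed.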

I hope this helps someone else.

Glad that you posted the solution as well. Hope it helps the community at large.

Thanks,
Rashmi
