Stuck with a timezone issue in the Logstash JDBC input

I am a bit lost here.

A transaction happens in Sydney and the record is stored in a SQL database. The time is stored in UTC.

Five minutes later, we pull that data using the Logstash JDBC input, and Kibana shows the latest data.

We end up with a Discover panel where the events are shown trailing by 11 hours.

Our end customers are wondering why something that happened just now is shown as having happened 11 hours ago in the Discover panel.

This is a sample of the data I get from the database: 2020-06-03 21:19:41.783

I thought of leaving the original field as it is and adding a new field, then using that new field as the time field when creating the index pattern. This is what I set in a Logstash Ruby filter:

		code => "event.set('localdateexp', event.get('createdon').time.localtime.strftime('%Y-%m-%dT%H:%M:%S.%3N%z'))"

But I did not create a new index pattern, since in the Discover panel the new localdateexp field looked the same as the createdon field.

I do not want to change the Kibana timezone setting from Browser to UTC, because there are other indices which get data directly from applications rather than from a database.

Any ideas?

I'm a bit confused right now, but let's try to tidy up the chaos in my head: something happened at 12:37 local time and was imported at 13:00, so 02:37 and 03:00 UTC, because Sydney is UTC+10. It's now showing up as 02:37 local time instead, which would mean that at some point in your pipeline UTC was interpreted as Sydney time, and the JSON representation of your event with the UTC dates says that createdon is 16:37 the previous day while @timestamp is 03:37 today? Is that right?

That is what I think is happening.
"UTC was interpreted as Sydney time": I think so, since I am not touching the field. Should I be doing something in Logstash?

To test this, I introduced that Ruby filter and tried to create a new field from the value of the createdon field, but no luck yet.

I have read other answers, and most of them say it is better to leave things in UTC, since Curator, Kibana, etc. expect UTC. But this messes up the Discover panel.

You may be able to fix this by setting the jdbc_default_timezone option on the jdbc input. Alternatively, configure the jdbc_connection_string to tell the driver which timezone to use.
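A rough sketch of what that could look like. The driver class, connection string, credentials, query, and schedule below are placeholders, not details from this thread; the relevant line is jdbc_default_timezone, which declares the timezone of the timestamps as stored in the database:

		input {
		  jdbc {
		    # placeholders -- substitute your own driver, connection string, and query
		    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
		    jdbc_connection_string => "jdbc:sqlserver://dbhost:1433;databaseName=mydb"
		    jdbc_user => "user"
		    statement => "SELECT * FROM transactions WHERE createdon > :sql_last_value"
		    schedule => "*/5 * * * *"
		    # tell Logstash which timezone the database timestamps are in,
		    # instead of letting it assume the local machine's timezone;
		    # here the column holds UTC, so declare it as UTC
		    jdbc_default_timezone => "UTC"
		  }
		}

With this set, the input converts createdon from the declared timezone into a proper UTC Timestamp, rather than misinterpreting the naive value in the server's local zone.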

Hi @Badger, I will try that setting and update the thread with the results soon.

Meanwhile, I have realized that my Logstash on the Linux box is version 7.7.0 while the ES instance in the cloud is 7.7.1. Maybe that could be the reason.

Otherwise, I see that the createdon field is different for events which happened within an hour of each other.

"createdon": "2020-06-12T03:46:27.467Z",

"createdon": "2020-06-11T17:36:49.950Z",

In the database explorer I exported the results as CSV and opened them in Notepad++ to see the raw data.

The format it comes in is:
2020-06-12 03:46:27.467
2020-06-12 03:36:49.950

There is no trailing Z at the end of the timestamps. Is this something I can process explicitly in Elasticsearch?
I could use Logstash filters to append a zone to the end of the timestamp, though I am a bit cagey about that, and about DST in particular.
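For what it's worth, appending a fixed offset by string manipulation is exactly what breaks under DST; the usual approach is a date filter with a timezone option, which applies the correct offset for each individual date. A hedged sketch, assuming the field name createdon from this thread and a column holding naive local Sydney time (for a column that already holds UTC, the timezone would be "UTC" instead):

		filter {
		  date {
		    # parse the naive database timestamp, e.g. 2020-06-12 03:46:27.467
		    match    => ["createdon", "yyyy-MM-dd HH:mm:ss.SSS"]
		    # interpret it in this zone; the filter handles DST transitions,
		    # so no hand-appended "+10:00"/"+11:00" suffix is needed
		    timezone => "Australia/Sydney"
		    target   => "createdon"
		  }
		}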

Sorry, these are not the exact events whose timestamps I quoted above, but I think they are enough to show the difference.

If you need further help finding the right timezone settings, it might help to set up a pipeline that consists only of your JDBC input and a rubydebug output, and post the results. The Z won't be necessary if Logstash is told which timezone to expect. Your target should be a correct Logstash Timestamp object (which uses UTC internally), not a specific string format for the date.
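A minimal debug pipeline of that shape might look like this (the jdbc settings themselves are elided; reuse the ones from the main pipeline):

		input {
		  jdbc {
		    # copy the jdbc_* settings from your main pipeline here
		  }
		}
		output {
		  # print each event, including @timestamp and createdon, to the console
		  stdout { codec => rubydebug }
		}

Running it with something like bin/logstash -f debug.conf shows exactly what Timestamp values the input produces, before any other filters or outputs get involved.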

@Badger That setting solved my issue. Many thanks.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.