Cannot determine timezone from nil logstash

I'm getting an error while running logstash

" (ArgumentError) Cannot determine timezone from nil\n(secs:1700041898.446,utc~:"2023-11-15 09:51:38.4460000991821289",ltz~:nil)"

I have tried solutions from other threads (adding jdbc_default_timezone => "Asia/Riyadh" to the Logstash .conf file, or adding TZ="Asia/Riyadh" to the elasticsearch.yaml file), but it still didn't work for me.

The only way I found to get around this was to change the timezone of the server itself to UTC (this is only temporary until I have a better understanding of the ELK stack). Is there another solution?

ELK stack version 8.10.2

Welcome to the community!
The field has the value "2023-11-15 09:51:38.4460000991821289". How did you try the conversion? Please copy your date {} filter code.

I am surprised that setting jdbc_default_timezone did not fix it. Setting TZ="Asia/Riyadh" would need to be done in the shell script that invokes Logstash (and you might need export TZ="Asia/Riyadh"). Definitely not in elasticsearch.yaml, which Logstash does not use.
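To illustrate why the variable must be exported by the launching shell: an environment variable only reaches Logstash's JVM if the parent process passes it down. A minimal Python sketch of that inheritance (only the TZ name and value come from this thread; the rest is illustration):

```python
# Sketch: an exported environment variable is inherited by child processes,
# which is how the JVM behind Logstash would pick up TZ.
import os
import subprocess
import sys

env = dict(os.environ, TZ="Asia/Riyadh")  # equivalent of `export TZ=Asia/Riyadh`
child = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ.get('TZ'))"],
    env=env, capture_output=True, text=True,
)
print(child.stdout.strip())  # the child sees Asia/Riyadh
```

If TZ is set in a window or file the Logstash process never reads, the child sees nothing, which is consistent with the "Cannot determine timezone from nil" error.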

Thank you, here is my pipeline configuration:

    input {
        jdbc {
            clean_run => true
            jdbc_driver_library => "E:\ELK 8.10.2\logstash-conf\ojdbc6.jar"
            jdbc_driver_class => "oracle.jdbc.driver.OracleDriver"
            jdbc_connection_string => "*****"
            jdbc_user => "****"
            jdbc_password => "****"
            jdbc_default_timezone => "Asia/Riyadh"
            #schedule => "*/5 * * * *"
            #use_column_value => true
            #tracking_column => "CALLS"
            tags => ["oraclelogger"]
        }
    }
    filter {
        date {
            match => [ "CALL_STRAT_DATE_TIME", "yyyy/MM/dd HH:mm:ss" ]
            timezone => "Asia/Riyadh"
            target => "@timestamp"
        }
    }
    output {
        elasticsearch {
            hosts => ["http://****:9200/"]
            index => "testlogger_index"
            user => "elastic"
            password => "****"
            ssl => false
            ssl_certificate_verification => false
        }
    }

I have also tried converting @timestamp itself to a new field (through a filter), but it didn't work; I tried different formats too (UNIX/ISO) and am still not getting any results.
I also tried selecting the time column from the DB with the TO_CHAR function converted to the format 'yyyy/MM/dd HH:mm:ss', and tried converting to UTC:

    CAST(
      FROM_TZ(Call_Start_Time, 'Asia/Riyadh') AT TIME ZONE 'UTC' AS TIMESTAMP
    ) AS Call_Start_Time_UTC
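For comparison, the same Riyadh-to-UTC conversion can be checked outside the database. A hedged Python sketch (the sample time mirrors the utc value in the error message; Asia/Riyadh is UTC+3):

```python
# Convert a Riyadh-local timestamp to UTC, mirroring the
# FROM_TZ(...) AT TIME ZONE 'UTC' idea from the SQL above.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

local = datetime(2023, 11, 15, 12, 51, 38, tzinfo=ZoneInfo("Asia/Riyadh"))
utc = local.astimezone(timezone.utc)
print(utc.strftime("%Y-%m-%d %H:%M:%S"))  # 2023-11-15 09:51:38
```

The UTC result matches the utc value in the original error, so the arithmetic itself is fine; the missing piece is the local timezone (ltz) that Logstash could not determine.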

This is getting really frustrating tbh :frowning:

Me too. Everyone is complaining about data being stored with a time offset due to different timezones, while it's not even working for me.
Also, I would like to thank you for your replies to other threads, they helped me a lot :slight_smile:

Ah sorry, Badger is right and pointed in the right direction. There is a bug.

Since you are on Windows, you can use tzutil.

Or you can try with PowerShell:

  1. Get-TimeZone
  2. Get-TimeZone -ListAvailable
  3. Set-TimeZone -Name "Arab Standard Time" (for Kuwait and Riyadh, UTC+3)

Since I haven't had this issue myself: if it still doesn't work (the ELK stack is more of a Linux environment), maybe try setting the TZ system environment variable.

The timezone is already set to "(UTC+03:00) Kuwait, Riyadh".
I have tried setting the ENV variable TZ with the value Asia/Riyadh (correct me if that's wrong). Still the same issue.
I think the problem is that Logstash is trying to derive the default @timestamp UTC time from the server timezone, since it works perfectly fine when the server timezone is UTC.
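That hypothesis can be sanity-checked: on Unix-like systems a process derives its local UTC offset from the TZ variable, which is consistent with Logstash falling back to the host timezone when nothing else tells it one. A small sketch (Unix-only, because of time.tzset):

```python
# The process-local UTC offset follows the TZ environment variable (Unix only).
import os
import time

os.environ["TZ"] = "Asia/Riyadh"
time.tzset()          # make the process re-read TZ
print(time.timezone)  # -10800: seconds west of UTC, i.e. UTC+3
```

If TZ is unset and the host clock is already UTC, the offset is simply zero, which would explain why everything "just works" when the server timezone is UTC.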

Not sure where the issue is.

Since you have already set jdbc_default_timezone, a last thing to try is setting plugin_timezone.
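For reference, plugin_timezone is an option of the jdbc input (not the date filter). A minimal sketch of where it would sit in the configuration above, under the assumption (from the plugin's documented options) that it accepts "utc" or "local":

```
input {
    jdbc {
        # ... connection settings as in the pipeline above ...
        jdbc_default_timezone => "Asia/Riyadh"
        plugin_timezone => "local"   # or "utc" (the default)
    }
}
```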

Badger, if plugin_timezone doesn't help, what do you think, would log.level: trace lead somewhere?

Thanks a lot @Rios, I have recreated the TZ variable and restarted the server, and it worked. Later I changed the timezone from the Kibana settings (in the browser) and now everything is working fine. Really appreciate this :smiley:


Thank you for the feedback, really appreciated. It's important for the community.

ELK is nice, but you have to understand how things work. Someone will help you even if they didn't have a similar problem. Elastic :smiling_face_with_three_hearts:


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.