Timezone problem/question again

In the past I had a problem with a datestamp field, but I resolved it with the following approach.

Same index:

read data from source -> logstash -> send field1 to ELK as is; it is converted to UTC and converted back to local time by Kibana

read data from source -> logstash -> create a new field1_timezone=UTC -> send to ELK; it is saved as is

In the above case I can use field1 in Kibana and field1_timezone for SQL queries, and everything works.

Now I am reading the data via Python, and this logic is not working.

How do I save the datetime I get from the source to ELK unchanged, so that it is converted to UTC by default? As it stands, the record moves to the previous day.

For example:

    {'count': 130, 'status_achieved': datetime.datetime(2020, 11, 3, 0, 0), 'status_achieved_timezone': datetime.datetime(2020, 11, 3, 0, 0, tzinfo=<StaticTzInfo 'Etc/UTC'>), 'timestamp': datetime.datetime(2020, 12, 9, 19, 16, 17, 280173), '_id': 'xyz1234567'}

In the above, 'status_achieved' is the same value I got from the source, and 'status_achieved_timezone' is converted to UTC.

This record should show up on 11/03/2020, but in Kibana it shows up on 11/02/2020.

This is how the record shows up in Discover:

    "status_achieved": "2020-11-03T00:00:**00-05:51**",
    "status_achieved_timezone": "2020-11-03T00:00:00+00:00",
    "count": 130,
    "timestamp": "2020-11-19T13:16:44.085568",
    "fields": {
      "status_achieved": [
      "status_achieved_timezone": [
      "timestamp": [
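Working through the numbers shows why the day shifts: the bogus -05:51 offset puts local midnight on Nov 3 at 05:51 UTC, and Kibana then renders that instant in the browser's own timezone. A small stdlib-only sketch (assuming the browser is on US Central time, UTC-6 on that date):

```python
import datetime

# The stored value: local midnight Nov 3 tagged with the bogus -05:51 offset
stored = datetime.datetime(
    2020, 11, 3, 0, 0,
    tzinfo=datetime.timezone(-datetime.timedelta(hours=5, minutes=51)))

# Elasticsearch normalizes to UTC: 00:00 at -05:51 becomes 05:51 UTC, still Nov 3
utc = stored.astimezone(datetime.timezone.utc)

# Kibana renders in the browser's zone; assume CST (UTC-6) here
cst = stored.astimezone(datetime.timezone(-datetime.timedelta(hours=6)))

print(utc.isoformat())  # 2020-11-03T05:51:00+00:00
print(cst.isoformat())  # 2020-11-02T23:51:00-06:00 -> bucketed on 11/02
```

So the record is not wrong in Elasticsearch per se; it carries an offset that was never the real local offset, and the browser-local rendering lands it on the previous day.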

This is how I was converting the time in Python:

    from pytz import timezone

    # buggy: .replace() with a pytz zone attaches the zone's base (LMT)
    # offset, e.g. -05:51 for America/Chicago, not the real CST/CDT offset
    datetime_obj_utc = mytime_timestamp.replace(tzinfo=timezone('Etc/UTC'))
    datetime_obj_cst = mytime_timestamp.replace(tzinfo=timezone('America/Chicago'))

This is now fixed; I used the following instead:

    import pytz

    # localize() resolves the offset actually in effect on that date (CST/CDT)
    datetime_obj_cst = pytz.timezone('America/Chicago').localize(mytime_timestamp)
    datetime_obj_utc = pytz.timezone('Etc/UTC').localize(mytime_timestamp)
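The difference between the two approaches can be shown directly: with pytz, `.replace(tzinfo=...)` attaches the zone's base (LMT) offset, while `.localize()` resolves the offset that actually applies on that date. A minimal sketch using the same Nov 3 timestamp from the example above:

```python
import datetime
import pytz

naive = datetime.datetime(2020, 11, 3, 0, 0)

# Wrong: replace() picks up pytz's base (LMT) offset for Chicago, -05:51,
# which is exactly the odd offset seen in the Discover output
wrong = naive.replace(tzinfo=pytz.timezone('America/Chicago'))

# Right: localize() resolves the offset in effect on Nov 3, 2020 (CST, -06:00)
right = pytz.timezone('America/Chicago').localize(naive)

print(wrong.isoformat())  # 2020-11-03T00:00:00-05:51
print(right.isoformat())  # 2020-11-03T00:00:00-06:00
```

On Python 3.9+ the stdlib `zoneinfo` module avoids this pitfall entirely, since `zoneinfo` timezones work correctly with plain `.replace(tzinfo=...)`.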
