Default TimeZone

Hi,

I am facing a time zone issue while querying data on a stored index.

Issue: We have dates stored like "2021-10-04T11:11:27-07:00" on a server in "GMT", but while pulling data using an aggregation the value comes back as "2021-10-04T18:11:27-07:00" (accessing from IST).

Please share your thoughts on how to get the data exactly as it is stored in the index, irrespective of where we are querying from.

Can you share your query?

Are you running the query from Kibana?

All dates are stored internally in Elasticsearch as UTC (GMT).

So there's also the question: are you sure that you're ingesting the dates correctly?

If you just look at the fields in a single document, is the timestamp correct?

Data I have in Elasticsearch:
(screenshot of a document)

Here is my query; the resulting data has 7 hours added to it.

          "query": {
            "range": {
              "body.version": {
                "gte": "now-12d/d",
                "lte": "now/d"
              }
            }
          },
          "aggs": {
            "file_delay_agg": {
              "multi_terms": {
                "terms": [
                  {
                    "field": "body.dimensions.idnName.keyword"
                  },
                  {
                    "field": "body.dimensions.clientKey"
                  }
                ]
              },
              "aggs": {
                "lastfiledate": {
                  "max": {
                    "field": "body.dates.extractionDateTime",
                    "format": "yyyy-MM-dd HH:mm:ss"
                  }
                }
              }
            }
          }

The timestamp for all documents is the same as mentioned in the screenshot.

@M.Naim I forgot to welcome you to the community! Please try not to post screenshots of text; it makes it hard to help and debug. Also, please format your code going forward with the </> button; I did that for you.

Also, you did not answer where you are running this query. Are you running it in Kibana?

To be clear: all timestamps are stored in UTC in Elasticsearch. A time_zone can be applied on input and on results (in some cases).
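To illustrate that normalization, here is a minimal Python sketch (standard library only, nothing Elasticsearch-specific) showing that the offset date from the original question is the same instant as 18:11:27 UTC:

```python
from datetime import datetime, timezone

# The value from the original question, parsed with its -07:00 offset...
local = datetime.fromisoformat("2021-10-04T11:11:27-07:00")

# ...is the same instant as 18:11:27 UTC, which is what Elasticsearch
# stores internally and what aggregations return.
utc = local.astimezone(timezone.utc)
print(utc.isoformat())  # 2021-10-04T18:11:27+00:00
```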

Kibana Discover automatically converts to the local time zone of the browser; this can sometimes cause confusion.

Can you please run a simple query that will show both the _source (what is in your screenshot) and the actual stored field?

Can you do this in Kibana Dev Tools, please, and show the results for one of the documents?

GET your-index-name/_search
{
  "_source": ["body.dates.extractionDateTime"], 
  "fields": ["body.dates.extractionDateTime"]
}

The max aggregation will return the max date in UTC. Unfortunately, it does not appear that the max aggregation takes a time_zone parameter, unlike a date_histogram.

POST discuss-time/_doc
{
  "mydate" : "2022-05-15T05:49:00-08:00"
}

POST discuss-time/_doc
{
  "mydate" : "2022-05-15T06:49:00-08:00"
}

POST discuss-time/_doc
{
  "mydate" : "2022-05-15T07:49:00-08:00"
}

GET discuss-time/_search
{
  "_source": ["mydate"], 
  "fields": ["*"]
}

Results: notice the difference between the source field and the stored field, which is in UTC.

{
  "took" : 0,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 3,
      "relation" : "eq"
    },
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "discuss-time",
        "_type" : "_doc",
        "_id" : "Ygc5yIAB5xx3K-gmvcWc",
        "_score" : 1.0,
        "_source" : {
          "mydate" : "2022-05-15T05:49:00-08:00"
        },
        "fields" : {
          "mydate" : [
            "2022-05-15T13:49:00.000Z"
          ]
        }
      },
      {
        "_index" : "discuss-time",
        "_type" : "_doc",
        "_id" : "Ywc5yIAB5xx3K-gmvcW9",
        "_score" : 1.0,
        "_source" : {
          "mydate" : "2022-05-15T06:49:00-08:00"
        },
        "fields" : {
          "mydate" : [
            "2022-05-15T14:49:00.000Z"
          ]
        }
      },
      {
        "_index" : "discuss-time",
        "_type" : "_doc",
        "_id" : "ZAc5yIAB5xx3K-gmvcXl",
        "_score" : 1.0,
        "_source" : {
          "mydate" : "2022-05-15T07:49:00-08:00"
        },
        "fields" : {
          "mydate" : [
            "2022-05-15T15:49:00.000Z"
          ]
        }
      }
    ]
  }
}

Now the aggs... max will return the value in UTC.

GET discuss-time/_search
{
  "_source": ["mydate"],
  "fields": ["*"],
  "query": {
    "match_all": {}
  },
  "aggs": {
    "lastfiledate": {
      "max": {
        "field": "mydate"
      }
    }
  }
}

Results

{
  "took" : 1,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 3,
      "relation" : "eq"
    },
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "discuss-time",
        "_type" : "_doc",
        "_id" : "Ygc5yIAB5xx3K-gmvcWc",
        "_score" : 1.0,
        "_source" : {
          "mydate" : "2022-05-15T05:49:00-08:00"
        },
        "fields" : {
          "mydate" : [
            "2022-05-15T13:49:00.000Z"
          ]
        }
      },
      {
        "_index" : "discuss-time",
        "_type" : "_doc",
        "_id" : "Ywc5yIAB5xx3K-gmvcW9",
        "_score" : 1.0,
        "_source" : {
          "mydate" : "2022-05-15T06:49:00-08:00"
        },
        "fields" : {
          "mydate" : [
            "2022-05-15T14:49:00.000Z"
          ]
        }
      },
      {
        "_index" : "discuss-time",
        "_type" : "_doc",
        "_id" : "ZAc5yIAB5xx3K-gmvcXl",
        "_score" : 1.0,
        "_source" : {
          "mydate" : "2022-05-15T07:49:00-08:00"
        },
        "fields" : {
          "mydate" : [
            "2022-05-15T15:49:00.000Z"
          ]
        }
      }
    ]
  },
  "aggregations" : {
    "lastfiledate" : {
      "value" : 1.65262974E12,
      "value_as_string" : "2022-05-15T15:49:00.000Z"
    }
  }
}

Now with your formatting... it's the same.

GET discuss-time/_search
{
  "_source": ["mydate"],
  "fields": ["*"],
  "query": {
    "match_all": {}
  },
  "aggs": {
    "lastfiledate": {
      "max": {
        "field": "mydate",
        "format": "yyyy-MM-dd HH:mm:ss"
      }
    }
  }
}

Results

{
  "took" : 7,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 3,
      "relation" : "eq"
    },
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "discuss-time",
        "_type" : "_doc",
        "_id" : "Ygc5yIAB5xx3K-gmvcWc",
        "_score" : 1.0,
        "_source" : {
          "mydate" : "2022-05-15T05:49:00-08:00"
        },
        "fields" : {
          "mydate" : [
            "2022-05-15T13:49:00.000Z"
          ]
        }
      },
      {
        "_index" : "discuss-time",
        "_type" : "_doc",
        "_id" : "Ywc5yIAB5xx3K-gmvcW9",
        "_score" : 1.0,
        "_source" : {
          "mydate" : "2022-05-15T06:49:00-08:00"
        },
        "fields" : {
          "mydate" : [
            "2022-05-15T14:49:00.000Z"
          ]
        }
      },
      {
        "_index" : "discuss-time",
        "_type" : "_doc",
        "_id" : "ZAc5yIAB5xx3K-gmvcXl",
        "_score" : 1.0,
        "_source" : {
          "mydate" : "2022-05-15T07:49:00-08:00"
        },
        "fields" : {
          "mydate" : [
            "2022-05-15T15:49:00.000Z"
          ]
        }
      }
    ]
  },
  "aggregations" : {
    "lastfiledate" : {
      "value" : 1.65262974E12,
      "value_as_string" : "2022-05-15 15:49:00"
    }
  }
}

With a date_histogram you can apply a time_zone, but I do not think that is what you want, because you want the max.

GET discuss-time/_search
{
  "_source": ["mydate"],
  "fields": ["*"],
  "query": {
    "match_all": {}
  },
  "aggs": {
    "lastfiledate": {
      "date_histogram": {
        "fixed_interval": "1h",
        "field": "mydate",
        "format": "yyyy-MM-dd HH:mm:ssZ",
        "time_zone": "America/Los_Angeles" <!--- or -0700
      }
    }
  }
}

Result:

{
  "took" : 1,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 3,
      "relation" : "eq"
    },
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "discuss-time",
        "_type" : "_doc",
        "_id" : "Ygc5yIAB5xx3K-gmvcWc",
        "_score" : 1.0,
        "_source" : {
          "mydate" : "2022-05-15T05:49:00-08:00"
        },
        "fields" : {
          "mydate" : [
            "2022-05-15T13:49:00.000Z"
          ]
        }
      },
      {
        "_index" : "discuss-time",
        "_type" : "_doc",
        "_id" : "Ywc5yIAB5xx3K-gmvcW9",
        "_score" : 1.0,
        "_source" : {
          "mydate" : "2022-05-15T06:49:00-08:00"
        },
        "fields" : {
          "mydate" : [
            "2022-05-15T14:49:00.000Z"
          ]
        }
      },
      {
        "_index" : "discuss-time",
        "_type" : "_doc",
        "_id" : "ZAc5yIAB5xx3K-gmvcXl",
        "_score" : 1.0,
        "_source" : {
          "mydate" : "2022-05-15T07:49:00-08:00"
        },
        "fields" : {
          "mydate" : [
            "2022-05-15T15:49:00.000Z"
          ]
        }
      }
    ]
  },
  "aggregations" : {
    "lastfiledate" : {
      "buckets" : [
        {
          "key_as_string" : "2022-05-15 06:00:00-0700",
          "key" : 1652619600000,
          "doc_count" : 1
        },
        {
          "key_as_string" : "2022-05-15 07:00:00-0700",
          "key" : 1652623200000,
          "doc_count" : 1
        },
        {
          "key_as_string" : "2022-05-15 08:00:00-0700",
          "key" : 1652626800000,
          "doc_count" : 1
        }
      ]
    }
  }
}

Hi stephenb,

I am using Kibana Dev Tools.

Here is my query:

POST MyIndex-2022-04-01/_search
{
  "query": {
    "match": {
      "body.dates.extractionDateTime": "2021-09-26T09:08:29-07:00"
    }
  },
  "aggs": {
    "file_delay_agg": {
      "multi_terms": {
        "terms": [
          {
            "field": "body.dimensions.idnName.keyword"
          },
          {
            "field": "body.dimensions.clientKey"
          }
        ]
      },
      "aggs": {
        "lastfiledate": {
          "max": {
            "field": "body.dates.extractionDateTime",
            "format": "yyyy-MM-dd HH:mm:ss"
          }
        }
      }
    }
  },
  "size": 1
} 

In the result, please look at extractionDateTime.

{
  "took" : 2,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 1,
      "relation" : "eq"
    },
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "MyIndex-2022-04-01",
        "_type" : "_doc",
        "_id" : "MDeMwaTNXLenX9DUir4g6GwAAAAAAAAA",
        "_score" : 1.0,
        "_source" : {
          "ReceivedDate" : "2022-05-06T11:46:06.2832412Z",
          "metricId" : "EL-DATA-LATENCY",
          "PersistedDate" : "2022-05-06T11:46:07.7772732Z",
          "tenantId" : "770",
          "threshold" : {
            "minimum-admin-file-process-duration" : 5.0,
            "minimum-admin-file-delay" : 24.0,
            "rep-to-linking-batch-interval" : 2.0,
            "batch-high-count" : 50.0,
            "minimum-es-file-delay" : 24.0,
            "minimum-es-file-process-duration" : 5.0
          },
          "body" : {
            "data" : {
              "processDurationSecondsFileMover" : 0.0,
              "processDurationSecondsLinking" : 0.0,
              "processDurationSecondsElasticProcessing" : 0.0,
              "latencySecondsMedRepToLinking" : 0.0,
              "processDurationSecondsFileCreation" : 0.0,
              "latencySecondsLinkingToFileCreation" : 0.0,
              "overallLatencySeconds" : 0.0,
              "processDurationSecondsMedRepository" : 86.0,
              "latencySecondsMedImportToMedRep" : 488.0,
              "latencySecondsFileMoverToMedImport" : 41.0,
              "processDurationSecondsFileUpload" : 0.0,
              "processDurationSecondsMedImport" : 2.0,
              "latencySecondsFileUploadToElasticProcessing" : 0.0,
              "latencySecondsFileCreationToUpload" : 0.0
            },
            "dates" : {
              "fileUploadStartDateTime" : "0001-01-01T00:00:00+00:00",
              "elasticFileCreationEndDateTime" : "0001-01-01T00:00:00+00:00",
              "repositoryEndDateTime" : "2022-04-23T18:55:47-07:00",
              "importStartDateTime" : "2022-04-23T18:46:11-07:00",
              "repositoryStartDateTime" : "2022-04-23T18:54:21-07:00",
              "fileMoverEndDateTime" : "2022-04-23T18:45:30-07:00",
              "fileUploadProcessCreatedDateTime" : "0001-01-01T00:00:00+00:00",
              "elasticProcessingEndDateTime" : "0001-01-01T00:00:00+00:00",
              "extractionDateTime" : "2021-09-26T09:08:29-07:00",
              "elasticFileProcessCreatedDateTime" : "0001-01-01T00:00:00+00:00",
              "linkingCreatedDateTime" : "0001-01-01T00:00:00+00:00",
              "linkingProcessingEndDateTime" : "0001-01-01T00:00:00+00:00",
              "elasticProcessingCreatedDateTime" : "0001-01-01T00:00:00+00:00",
              "fileMoverStartDateTime" : "2022-04-23T18:45:30-07:00",
              "dateStamp" : "2022-04-23T00:00:00-07:00",
              "elasticProcessingStartDateTime" : "0001-01-01T00:00:00+00:00",
              "repositoryCreatedDateTime" : "0001-01-01T00:00:00+00:00",
              "elasticFileCreationStartDateTime" : "0001-01-01T00:00:00+00:00",
              "importEndDateTime" : "2022-04-23T18:46:13-07:00",
              "fileUploadEndDateTime" : "0001-01-01T00:00:00+00:00",
              "linkingProcessingStartDateTime" : "0001-01-01T00:00:00+00:00"
            },
            "version" : "2022-05-06T04:45:51-07:00",
            "dimensions" : {
              "schema" : "1.0.0",
              "batchKey" : "3055",
              "clientKey" : 770.0,
              "sourceSystem" : "EMR System",
              "sourceSystemKey" : "5",
              "metricRecordId" : "02d150d7-d2d6-4ef2-ba11-8389700b5b30",
              "timeZone" : "Pacific Standard Time",
              "facilityName" : "UVA Full Extract - 20200121 ",
              "idnName" : "Diversion 1.7 Baseline IDN - Infusion UVA",
              "batchClientKey" : "6601",
              "idnKey" : 93.0
            }
          }
        }
      }
    ]
  },
  "aggregations" : {
    "file_delay_agg" : {
      "doc_count_error_upper_bound" : 0,
      "sum_other_doc_count" : 0,
      "buckets" : [
        {
          "key" : [
            "Diversion 1.7 Baseline IDN - Infusion UVA",
            770.0
          ],
          "key_as_string" : "Diversion 1.7 Baseline IDN - Infusion UVA|770.0",
          "doc_count" : 1,
          "lastfiledate" : {
            "value" : 1.632672509E12,
            "value_as_string" : "2021-09-26 16:08:29"
          }
        }
      ]
    }
  }
}

Hi @M.Naim

Thanks for the data.

However, you did not provide the result of the actual request I asked for.

I wanted to check something. Can you please run this:

GET your-index-name/_search
{
  "size" : 2
  "_source": ["body.dates.extractionDateTime"], 
  "fields": ["body.dates.extractionDateTime"]
}

But in the end, what I already said is happening... The max aggregation you have at the bottom will return the max date in UTC. Unfortunately, it does not appear that the max aggregation takes a time_zone parameter, unlike a date_histogram, and there does not appear to be a simple way to convert that.

Also, just put a Z at the end of your date format and you will see the date is in UTC:

 "aggs": {
    "lastfiledate": {
      "max": {
        "field": "body.dates.extractionDateTime",
        "format": "yyyy-MM-dd HH:mm:ssZ" <!--- Here
      }
    }
  }
}

Hi Stephen

Here is the result of your ask:

{
  "took" : 1,
  "timed_out" : false,
  "_shards" : {
    "total" : 10,
    "successful" : 10,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 1659,
      "relation" : "eq"
    },
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "my-index-2021-04-01",
        "_type" : "_doc",
        "_id" : "YZTApVjuaF0Cr8-Nkm-tk4MAAAAAAAAA",
        "_score" : 1.0,
        "_source" : {
          "body" : {
            "dates" : {
              "extractionDateTime" : "2020-01-22T03:00:17-08:00"
            }
          }
        },
        "fields" : {
          "body.dates.extractionDateTime" : [
            "2020-01-22T11:00:17.000Z"
          ]
        }
      },
      {
        "_index" : "my-index-2021-07-01",
        "_type" : "_doc",
        "_id" : "MMXhJGA1UjcZyZczo8PaK5IAAAAAAAAA",
        "_score" : 1.0,
        "_source" : {
          "body" : {
            "dates" : {
              "extractionDateTime" : "2020-04-21T11:04:14-07:00"
            }
          }
        },
        "fields" : {
          "body.dates.extractionDateTime" : [
            "2020-04-21T18:04:14.000Z"
          ]
        }
      }
    ]
  }
}

I think you misunderstood the question, or I might not have been able to explain it well. The result which I sent you with the max aggregation has a time difference of 7 hours; I want it to match the data as stored, because we are fetching only one record.

To avoid the confusion/complexity of the max aggregation, I have created a runtime mapping field to show the difference between both values.

GET my-index-name/_search
{
  "size": 1,
  "runtime_mappings": {
    "extractionDateTime": {
      "type": "keyword",
      "script": {
        "source": "emit(doc['body.dates.extractionDateTime'].value.toString('yyyy-MM-dd HH:mm:ss'));"
      }
    }
  },
  "fields": [
    "extractionDateTime"
  ]
}

Result:

#! Use of the joda time method [toString(String)] is deprecated. Use [a DateTimeFormatter] instead.
{
  "took" : 7,
  "timed_out" : false,
  "_shards" : {
    "total" : 10,
    "successful" : 10,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 1659,
      "relation" : "eq"
    },
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "my-index-name-2021-04-01",
        "_type" : "_doc",
        "_id" : "YZTApVjuaF0Cr8-Nkm-tk4MAAAAAAAAA",
        "_score" : 1.0,
        "_source" : { 
          "PersistedDate" : "2022-05-06T11:46:36.5986693Z",
          "tenantId" : "770",
          "threshold" : {
            "minimum-admin-file-process-duration" : 5.0,
            "minimum-admin-file-delay" : 24.0
          },
          "body" : {
            "data" : {  "overallLatencySeconds" : 0.0
            },
            "dates" : {
              "fileUploadStartDateTime" : "0001-01-01T00:00:00+00:00",
              "elasticFileCreationEndDateTime" : "0001-01-01T00:00:00+00:00",
              "extractionDateTime" : "2020-01-22T03:00:17-08:00"
            },
            "version" : "2022-05-06T04:45:51-07:00",
            "dimensions" : {
              "schema" : "1.0.0",
              "batchKey" : "1519",
              "timeZone" : "Pacific Standard Time",
              "batchClientKey" : "5070" 
            }
          }
        },
        "fields" : {
          "extractionDateTime" : [
            "2020-01-22 11:00:17"
          ]
        }
      }
    ]
  }
}
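As an aside, the deprecation warning at the top of that result can be avoided by moving from the Joda-style toString to the java-time API, which also lets the script emit the timestamp in a chosen zone. An untested sketch along those lines (America/Los_Angeles is an assumed target zone here; adjust to taste):

```
GET my-index-name/_search
{
  "size": 1,
  "runtime_mappings": {
    "extractionDateTimeLocal": {
      "type": "keyword",
      "script": {
        "source": "emit(doc['body.dates.extractionDateTime'].value.withZoneSameInstant(ZoneId.of('America/Los_Angeles')).format(DateTimeFormatter.ofPattern('yyyy-MM-dd HH:mm:ss')))"
      }
    }
  },
  "fields": ["extractionDateTimeLocal"]
}
```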

Looking forward to fixing this conversion gap; it would be better if the time zone could be set at the watcher level, irrespective of which time zone is used to trigger it.

The max aggregation is always going to return in UTC.

If you want that to be different, you're going to have to do some post-processing, a script, or something similar; I don't have a simple solution for that.
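For the post-processing route, here is a minimal Python sketch. It assumes you read the epoch-millis `value` out of the aggregation response yourself; the target zone America/Los_Angeles is just an example:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

def agg_value_to_zone(epoch_millis: float, tz: str = "America/Los_Angeles") -> str:
    """Convert a max-aggregation `value` (epoch millis, UTC) to a local-time string."""
    utc = datetime.fromtimestamp(epoch_millis / 1000, tz=timezone.utc)
    return utc.astimezone(ZoneInfo(tz)).strftime("%Y-%m-%d %H:%M:%S")

# 1.65262974E12 is the `value` from the earlier max-agg response (2022-05-15T15:49:00Z);
# Los Angeles is on PDT (UTC-7) on that date.
print(agg_value_to_zone(1.65262974e12))  # 2022-05-15 08:49:00
```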


No worries, I have done the conversion to PST.

Thanks.