Date Mapping with Time Zone


#1

In mapping a date field in an index, is it possible for Elasticsearch to treat a field containing data like this as a date field?

2015-07-14 18:42:26 America/Toronto
2015-07-14 22:42:26 UTC

I could not find an option for specifying a time zone in the docs (https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-date-format.html). I am using standard Joda time zones, though (http://joda-time.sourceforge.net/timezones.html).

If the time zone cannot be taken into account, can the rest of the date (e.g. just 2015-07-14 18:42:26) still be salvaged without the time zone?

I also have a few fields that are in Unix epoch format, e.g.:
1436913746

I want to be able to map these in Kibana and plot data according to these dates, but I cannot seem to get the correct format in my index mappings. Is this possible with the built-in mappings, or do I need to go for a custom mapping? (The docs seem to suggest that time zones are not parsed and all dates are treated as UTC.)


(Colin Goodheart-Smithe) #2

If you provide a valid Joda date format (see this excerpt from Joda's DateTimeFormat), Elasticsearch should be able to parse it, so time zone information can be included in the date format using z or Z. If you need to parse multiple date formats, you can separate the formats with || and Elasticsearch will try each one until it finds one that works.
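For example, a mapping along these lines might work for the dates in the question. This is only a sketch: the field name timestamp is an illustration, and in Joda the ZZZ pattern handles zone IDs like America/Toronto while lowercase z (zone names) generally cannot be parsed, so the exact pattern may need adjusting for your Elasticsearch version:

```json
{
  "mappings": {
    "_default_": {
      "properties": {
        "timestamp": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss ZZZ"
        }
      }
    }
  }
}
```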

For the fields that contain Unix epoch timestamps, you will need to explicitly map them as dates in your mappings; otherwise Elasticsearch will index them as longs. The default date format should also parse Unix epoch timestamps correctly.

Lastly, it is best to explicitly map all of these fields as dates rather than relying on the default dynamic mappings. You can set your own rules for dynamically mapping fields using index templates, but if you can, I would explicitly map each date field.
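As a minimal sketch of the index template approach, something like the following maps any new field whose name ends in _date as a date. The template name, index pattern, and field-naming convention here are all assumptions for illustration:

```json
PUT /_template/date_fields
{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "dynamic_templates": [
        {
          "dates_by_name": {
            "match": "*_date",
            "mapping": {
              "type": "date"
            }
          }
        }
      ]
    }
  }
}
```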

Hope that helps


#3

Thanks for the fast reply. I will try a custom date format in joda's date format.


#4

I was able to fix my previous problem and parse the time formats I provided. Thanks again! But I still ran into a problem trying to use unix epoch time (surprisingly).

"mappings": {
  "_default_": {
    "properties": {
      "unixstamp": {
        "type": "date"
      }
    }
  }
}

With this mapping, Kibana does not seem to convert the timestamps correctly. The timestamp 1436913746 should convert to a date in July of this year, but Kibana lists it as somewhere close to the beginning of the epoch in 1970. Do I need an additional setting to map Unix time fields correctly? The default of

 "format": "dateOptionalTime"

is applied to the unixstamp field when I check my index mappings (as expected from the docs). Is there some way I can force it to recognize the field as a Unix epoch timestamp and bypass this problem?


(Colin Goodheart-Smithe) #5

I think the problem is that Elasticsearch expects the epoch timestamp in milliseconds rather than seconds (from your example it appears your epoch times are in seconds; sorry if I missed that before). You will either need to convert the seconds values to milliseconds before you index the documents in Elasticsearch, or you could use the factor parameter on the date_histogram aggregation to convert them at aggregation time: https://www.elastic.co/guide/en/elasticsearch/reference/current/search-facets-date-histogram-facet.html#_factor


(Colin Goodheart-Smithe) #6

Sorry, that link is actually for facets rather than aggregations (my mistake), so it looks like you will need to convert at index time.
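As a side note, if you are on Elasticsearch 2.0 or later, the built-in epoch_second date format accepts second-resolution Unix timestamps directly, so no conversion is needed. A sketch of that mapping (reusing the unixstamp field name from above):

```json
"mappings": {
  "_default_": {
    "properties": {
      "unixstamp": {
        "type": "date",
        "format": "epoch_second"
      }
    }
  }
}
```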


(system) #7