Insert epoch time as "date" field, without converting to ISO8601 format

Hello,

I have an epoch time in the data that I am parsing via Logstash, and I want to insert it into a field in ES with the type "date". However, if I insert it as it is, ES assumes the type of my epoch time to be a "long".

I have managed to work around the issue by changing the mapping of the time field in ES to the following:

"time" : {
    "type" : "date",
    "format": "strict_date_optional_time||epoch_millis"
}

This solves my problem by letting me save epoch times with the type "date". However, I am not happy with it, as I would like Logstash to automatically ensure that the epoch time is recognized as a "date".

Here are a few methods that I have tried which have not worked:

The following simply converted my epoch time to ISO8601, i.e. the readable format:

date {
	match => ["[meta][time]", "UNIX_MS"]
	target => "[meta][time]"
}

This inserted my epoch time as a "long" and not a "date":

ruby { 
	code => "
		event.set('[meta][time]', event.get('[meta][time]').to_i)
	" 
}

Here I tried to create a temporary field (timeTemp) to hold the value of my epoch time, turn the original field into a date with the date filter, and then copy the value from the temp field back into the new date field, but again, this just inserts a long.

# Create the temp field
mutate {
	add_field => { "[meta][timeTemp]" => "%{[meta][time]}" } 
} 

# Create the date field
date {
	match => ["[meta][time]", "UNIX_MS"]
	target => "[meta][time]"
}

# Copy the epoch time to the date field
mutate {
	copy => { "[meta][timeTemp]" => "[meta][time]" }
}

# Delete the temp field
mutate {
	remove_field => ["[meta][timeTemp]"]
}

I have also looked at the convert option of the mutate filter, but I do not think Logstash supports converting to a date, as this gave me an error at runtime. Here is what I tried:

mutate {
	convert => ["[meta][time]", "date"]
}

I am now asking for advice on how I should proceed. Basically, I want to be able to insert my epoch time (for example 1656925998200, which is epoch time in milliseconds) into ES as the type "date", without converting it into any other format and without having to manually create a mapping for it.

convert supports conversions to integer, float, integer_eu, float_eu, string, and boolean. It cannot convert something to a date. That is what the date filter is for.
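
For illustration only (keeping the field name from your config), a valid convert would look like the sketch below; "date" is simply not an accepted target:

mutate {
	# sketch: convert the epoch value to an integer; "date" is not among the supported types
	convert => { "[meta][time]" => "integer" }
}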

Your mutate/date/mutate/mutate appears to be a no-op. It saves the value of [meta][time], parses it, then overwrites it with the original value.
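
Annotated step by step (the comments are mine), that sequence does the following:

# 1. save the epoch in a helper field (as a string, since %{} sprintf produces a string)
mutate { add_field => { "[meta][timeTemp]" => "%{[meta][time]}" } }
# 2. parse [meta][time] into a Logstash Timestamp
date {
	match => ["[meta][time]", "UNIX_MS"]
	target => "[meta][time]"
}
# 3. overwrite the freshly parsed Timestamp with the saved epoch
mutate { copy => { "[meta][timeTemp]" => "[meta][time]" } }
# 4. drop the helper field
mutate { remove_field => ["[meta][timeTemp]"] }

So the parsed date never survives to the output.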

If there is no mapping defined, then dynamic mapping in elasticsearch will use date detection to decide whether a field is a date. By default that makes anything that matches

[ "strict_date_optional_time","yyyy/MM/dd HH:mm:ss Z||yyyy/MM/dd Z"]

into a date field on the document. If a field in logstash is a date then it will be sent to elasticsearch in a format that matches strict_date_optional_time (yyyy-MM-dd'T'HH:mm:ss.SSSZ), so elasticsearch will treat it as a date.
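
For example, taking the epoch value from your question (my own conversion, just to illustrate), a [meta][time] parsed with UNIX_MS would reach elasticsearch as something like

{ "meta": { "time": "2022-07-04T09:13:18.200Z" } }

which matches strict_date_optional_time, so dynamic mapping maps it as a date.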

If you want the field to be a number in logstash but a date in elasticsearch, the only solution I can think of is to modify the dynamic_date_formats property of the index mapping to include epoch_millis. However, that still involves creating a mapping, and it will probably have significant unintended consequences for other numeric fields.
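
For completeness, the shape of that setting would be something like the sketch below (my-index is a placeholder, and I have not verified that elasticsearch actually honors epoch formats during dynamic date detection, which only runs on string values, so treat this as a starting point rather than a tested fix):

PUT my-index
{
    "mappings": {
        "dynamic_date_formats": ["strict_date_optional_time", "epoch_millis"]
    }
}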

