Copy another field with nanoseconds to elasticsearch/kibana @timestamp field

Hello,

My Elasticsearch is version 7.5.
I have:
a> a date type @timestamp field
b> a string type event_timestamp field (format: 2020-03-06 20:28:33.232123456 +0000) sent in each event.

What I want to achieve:
1st, I want to copy the @timestamp field to another field called ls_timestamp (ls_timestamp should have the same format as @timestamp after copying).
2nd, I want to copy the event_timestamp field value to @timestamp (I want to see all the nanoseconds from event_timestamp in @timestamp after copying).
3rd, I want to delete the event_timestamp field.

Please let me know how I can achieve this. Roughly, what I had in mind so far is below. Thanks.
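This is just an untested sketch based on the field names above; I am not sure whether the date filter pattern actually handles the nanoseconds:

mutate {
  # 1st: keep a copy of the current @timestamp in ls_timestamp
  add_field => { "ls_timestamp" => "%{@timestamp}" }
}
date {
  # 2nd: parse event_timestamp (with nanoseconds) into @timestamp
  match  => ["event_timestamp", "yyyy-MM-dd HH:mm:ss.SSSSSSSSS Z"]
  target => "@timestamp"
}
mutate {
  # 3rd: drop the original event_timestamp field
  remove_field => ["event_timestamp"]
}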

Have you mapped the field as date_nanos in Elasticsearch as described in this blog post?
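Something along these lines, just as a minimal sketch (the index name here is only a placeholder for your own):

PUT my-index
{
  "mappings": {
    "properties": {
      "event_timestamp": {
        "type": "date_nanos"
      }
    }
  }
}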

Hi @Christian,
I have set the date field format to nanoseconds.

For example, initial values:
@timestamp: Mar 9, 2020 @ 00:58:55.528000000
event_timestamp: 2020-03-09 04:58:54.497305256 +0000

After steps 1, 2 and 3 above are applied, I see:
@timestamp: Mar 9, 2020 @ 00:58:54.497000000
ls_timestamp: 2020-03-09T04:58:55.528Z

The issue I see is that @timestamp is not showing all the nanoseconds from event_timestamp, and ls_timestamp is showing in UTC even though my original @timestamp is in EST.

Filter used:
mutate {
  # 1st: copy the current @timestamp into ls_timestamp before it is overwritten
  add_field => ["ls_timestamp", "%{@timestamp}"]
}
date {
  # 2nd: parse event_timestamp into @timestamp
  match    => ["event_timestamp", "yyyy-MM-dd HH:mm:ss.SSSSSSSSS Z"]
  target   => "@timestamp"
  timezone => "UTC"
}

Elasticsearch requires all timestamps to be stored in UTC, and Kibana and other utilities rely on this.

In order for Elasticsearch to be able to handle and interpret nanosecond precision timestamps, you do need the special mapping I linked to. Can you please show us your index template?

As this is a reasonably new feature in Elasticsearch (7.x), I am not sure whether the Logstash date filter has been updated to support it yet. It used to only be able to handle the standard millisecond precision, if I recall correctly. Maybe someone from the Logstash team can shed some light on this?
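One way to check would be a minimal pipeline like this (untested sketch; feed it a single line containing the nanosecond timestamp and look at the precision of the parsed @timestamp in the rubydebug output):

input { stdin {} }
filter {
  date {
    # try to parse the full nanosecond value from the raw input line
    match  => ["message", "yyyy-MM-dd HH:mm:ss.SSSSSSSSS Z"]
    target => "@timestamp"
  }
}
output { stdout { codec => rubydebug } }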

Here is my index template:
{
  "order": 2,
  "index_patterns": [
    "mydata"
  ],
  "settings": {},
  "mappings": {
    "properties": {
      "event_timestamp": {
        "type": "date_nanos"
      },
      "@timestamp": {
        "type": "date_nanos"
      },
      "ls_timestamp": {
        "type": "date_nanos"
      }
    }
  },
  "aliases": {}
}

Seeing this message in LS:
Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"test-mydata-newfeature-weekly-2020.03-11-2", :routing=>nil, :_type=>"_doc"}, #LogStash::Event:0x7e53f105], :response=>{"index"=>{"_index"=>"test-mydata-newfeature-weekly-2020.03-11-2", "_type"=>"_doc", "_id"=>"EdBSwHABTddalsd96xyO", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [event_timestamp] of type [date_nanos] in document with id 'EdBSwHABTddalsd96xyO'. Preview of field's value: '2020-03-09 17:23:05.224925379 +0000'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2020-03-09 17:23:05.224925379 +0000] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"date_time_parse_exception: Failed to parse with all enclosed parsers"}}}}}}

I have been playing around with this. This is my latest updated index template:

{
  "order": 2,
  "index_patterns": [
    "mydata"
  ],
  "settings": {},
  "mappings": {
    "properties": {
      "ls_timestamp": {
        "type": "date_nanos"
      },
      "@timestamp": {
        "type": "date_nanos"
      },
      "event_timestamp": {
        "format": "yyyy-MM-dd HH:mm:ss.SSSSSSSSS Z",
        "type": "date_nanos"
      }
    }
  },
  "aliases": {}
}

I don't see the "could not index event" message above anymore, and I can see data in Kibana. But the nanoseconds are still getting chopped off after 3 decimal places when event_timestamp is copied to @timestamp, and event_timestamp is still coming through as a string.
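For now the only workaround I can think of (untested sketch, and assuming the 7.x date filter really is limited to millisecond precision) is to skip the date filter completely, let the date_nanos mapping with the custom format parse event_timestamp on the Elasticsearch side, and point the Kibana index pattern at event_timestamp as the time field instead of @timestamp:

filter {
  mutate {
    # keep Logstash's own arrival time for reference
    add_field => { "ls_timestamp" => "%{@timestamp}" }
  }
  # no date filter and no remove_field here: event_timestamp is indexed as
  # the raw string and Elasticsearch parses the full nanosecond value via
  # the date_nanos mapping with the custom format
}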

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.