How to map a timestamp with nanoseconds?

Hi,
I have a timestamp in this format: 2022-03-28T16:51:11.637003013Z

I tried to map it like this:

date {
  match => [ "Pretime", "yyyy-MM-dd'T'HH:mm:ss.SSSSSSSSSZ" ]
  target => "@timestamp"
  remove_field => "Pretime"
  add_tag => [ "match" ]
}

But I'm getting an error like this:

Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"copla_iot-gateway", :routing=>nil}, {"host"=>{"name"=>"7eeaa1655525"}, "LOG"=>""\u003c6\u003e 2022-03-28 16:51:11.636 +00:00 [INF] - Entering periodic task to reauthenticate connected clients\n"", "stream"=>""stdout"", "@timestamp"=>2022-03-28T16:51:12.018Z, "ecs"=>{"version"=>"1.12.0"}, "@version"=>"1", "Pretime"=>""2022-03-28T16:51:11.637003013Z"", "type"=>"iot_gateway", "file_path"=>"/var/lib/docker/containers/baf1688dab843d87c5ade68bc116a3d55be636e0e7afd08b4e3081ee05307e90/baf1688dab843d87c5ade68bc116a3d55be636e0e7afd08b4e3081ee05307e90-json.log", "tags"=>["beats_input_codec_plain_applied", "match", "_rubyexception", "_dateparsefailure"], "agent"=>{"id"=>"c9329a30-3871-4aa0-b246-8ce3ca374002", "ephemeral_id"=>"deb8b058-9418-4f69-8dd0-d7170ec71b07", "hostname"=>"7eeaa1655525", "version"=>"7.17.0", "name"=>"7eeaa1655525", "type"=>"filebeat"}, "port"=>41144, "input"=>{"type"=>"log"}, "message"=>"{"log":"\u003c6\u003e 2022-03-28 16:51:11.636 +00:00 [INF] - Entering periodic task to reauthenticate connected clients\n","stream":"stdout","time":"2022-03-28T16:51:11.637003013Z"}", "fields"=>{"logsource"=>"iot_gateway"}}], :response=>{"index"=>{"_index"=>"copla_iot-gateway-2022.03.08-1", "_type"=>"_doc", "_id"=>"94dw0X8Bf49xDEz_x4eA", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [Pretime] of type [date] in document with id '94dw0X8Bf49xDEz_x4eA'. Preview of field's value: '"2022-03-28T16:51:11.637003013Z"'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field ["2022-03-28T16:51:11.637003013Z"] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"date_time_parse_exception: Failed to parse with all enclosed parsers"}}}}}}

Please post the complete error message including the first "reason" message, not just the second.

Added the complete error, @Badger.

OK, so the Pretime field in Elasticsearch is of type date and the default parser for date fields cannot parse anything more than millisecond precision. But that does not matter, since if your date filter worked the [Pretime] field would be removed and so not indexed.

Your log message includes

"Pretime"=>""2022-03-28T16:51:11.637003013Z""

Note that there are two sets of double quotes around the timestamp. I suspect that the field value is really

"Pretime"=>"\"2022-03-28T16:51:11.637003013Z\""

The pattern has to match the entire field. I cannot create a pattern that matches the field value with quotes, so instead I suggest removing the quotes

mutate { gsub => [ "Pretime", '"', "" ] }

Your existing pattern then works and produces

"@timestamp" => 2022-03-28T16:51:11.637Z,

Although the logstash Timestamp class supports nanosecond precision, the date filter does not. If you want a string parsed with more than millisecond precision, the trick is to use mutate+add_field to create a JSON field and then parse it with a json filter rather than a date filter. See this SO thread for an example.
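A minimal sketch of that trick, using the Pretime field from this thread (the [@metadata][ts_json] name is just an illustration, and this assumes the surrounding quotes have already been removed with the gsub above):

mutate {
  # Wrap the nanosecond timestamp string in a small JSON document.
  # Keeping it under [@metadata] means the helper field is never indexed.
  add_field => { "[@metadata][ts_json]" => '{ "@timestamp": "%{Pretime}" }' }
}
json {
  # When the parsed JSON contains an @timestamp key the json filter sets the
  # event timestamp from it, preserving more precision than the date filter can.
  source => "[@metadata][ts_json]"
}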


@Badger That worked, thanks a million for the perfect explanation. :slight_smile:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.