Date parsing errors I need help debugging

I am getting this error in my Logstash error logs:

logstash[24891]: [2020-06-04T12:27:22,478][WARN ][logstash.outputs.elasticsearch][main][5a0b38d56348d6bc3ed40ef3de0fd75852bd567bf4973ad3b30fa5a1538d879b] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"var_log_app1-2020.06.04", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x4bdcd0b9>], :response=>{"index"=>{"_index"=>"var_log_app1-2020.06.04", "_type"=>"_doc", "_id"=>"PXLefnIBkgu1dtKt_KDZ", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [log_timestamp] of type [date] in document with id 'PXLefnIBkgu1dtKt_KDZ'. Preview of field's value: '20200604 122220'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [20200604 122220] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}

This is the date filter in my filter file:

  date {
    match => [ "log_timestamp", "yyyyMMdd HHmmss", "yyyy-MM-dd'T'HH:mm:ss" ]
    target => "@timestamp"
    timezone => "Europe/Berlin"
    add_field => { "debug" => "timestampMatched" }
  }

"log_timestamp" => "20200604 122218",

Rubydebug output (from the journal):

Jun 04 12:27:22 mon-01 logstash[24891]:               "ecs" => {
Jun 04 12:27:22 mon-01 logstash[24891]:         "version" => "1.5.0"
Jun 04 12:27:22 mon-01 logstash[24891]:     },
Jun 04 12:27:22 mon-01 logstash[24891]:          "@version" => "1"
Jun 04 12:27:22 mon-01 logstash[24891]: }
Jun 04 12:27:22 mon-01 logstash[24891]: {
Jun 04 12:27:22 mon-01 logstash[24891]:             "agent" => {
Jun 04 12:27:22 mon-01 logstash[24891]:              "version" => "7.7.0",
Jun 04 12:27:22 mon-01 logstash[24891]:                   "id" => "d5360ac6-52a9-408c-b439-dc6f5e8d7ae2",
Jun 04 12:27:22 mon-01 logstash[24891]:                 "type" => "filebeat",
Jun 04 12:27:22 mon-01 logstash[24891]:         "ephemeral_id" => "b7491df8-ad12-4a28-8597-34126129d3d8",
Jun 04 12:27:22 mon-01 logstash[24891]:                 "name" => "0domain-0-filebeat",
Jun 04 12:27:22 mon-01 logstash[24891]:             "hostname" => "0domain-0"
Jun 04 12:27:22 mon-01 logstash[24891]:     },
Jun 04 12:27:22 mon-01 logstash[24891]:               "log" => {
Jun 04 12:27:22 mon-01 logstash[24891]:         "offset" => 29640140,
Jun 04 12:27:22 mon-01 logstash[24891]:           "file" => {
Jun 04 12:27:22 mon-01 logstash[24891]:             "path" => "/var/log/app1/shop.log"
Jun 04 12:27:22 mon-01 logstash[24891]:         }
Jun 04 12:27:22 mon-01 logstash[24891]:     },
Jun 04 12:27:22 mon-01 logstash[24891]:             "debug" => "timestampMatched",
Jun 04 12:27:22 mon-01 logstash[24891]:        "@timestamp" => 2020-06-04T10:22:19.000Z,

I am using Elastic Stack version 7.7.0.

Can someone help?

Elasticsearch is trying to parse your date field with the format [strict_date_optional_time||epoch_millis] (which I think is the default), and it fails since your log format doesn't match it. You need to update your mapping in Elasticsearch to accept that format, or update your index template if you are using one.
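A minimal sketch of such a template, assuming the legacy template API (the stack here is 7.7.0); the template name and index pattern are placeholders inferred from the index name in the error and may need adjusting:

PUT _template/var_log_app1
{
  "index_patterns": ["var_log_app1-*"],
  "mappings": {
    "properties": {
      "log_timestamp": {
        "type": "date",
        "format": "yyyyMMdd HHmmss||strict_date_optional_time||epoch_millis"
      }
    }
  }
}

Note that templates are only applied when an index is created, so the existing daily index keeps its old mapping; the new format takes effect from the next index created (e.g. tomorrow's var_log_app1-* index).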

In the JSON output the timestamp is displayed like this:

"@timestamp": "2020-06-04T13:12:31.789Z"

How can I strip off the .789Z at the end?

In the rubydebug output the value of @timestamp is not surrounded by quotes, so we know it is a LogStash::Timestamp object. If you really want that to be a string then use mutate to convert and modify it:

# first convert the Timestamp object to its string form ("2020-06-04T13:12:31.789Z")
mutate { convert => { "@timestamp" => "string" } }
# then strip the trailing subsecond part along with the Z
mutate { gsub => [ "@timestamp", "\.\d{3}Z$", "" ] }

However, I suspect you will not be happy with the result.
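If the goal is just a second-precision string, a less intrusive sketch (the target field name formatted_timestamp is hypothetical) is to render @timestamp into a new field with sprintf date formatting, leaving @timestamp itself untouched:

mutate {
  # %{+FORMAT} applies a Joda-Time format to @timestamp; 'T' is a quoted literal
  add_field => { "formatted_timestamp" => "%{+yyyy-MM-dd'T'HH:mm:ss}" }
}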

Just want to highlight that this particular error is caused by the [log_timestamp] field, not the @timestamp field.

The rubydebug output shows that @timestamp is generated properly.
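To confirm which format Elasticsearch currently expects for that field, you can check its mapping directly (the index name comes from the error message above):

GET var_log_app1-2020.06.04/_mapping/field/log_timestamp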
