Logstash 8 @timestamp field format was changed to microsecond precision

Hi,
I have been using Logstash for a while now, and after upgrading to version 8 I can see that the @timestamp field format was changed from millisecond precision to microsecond precision (meaning instead of 2022-07-28T09:46:06.200Z we are getting 2022-07-28T09:46:06.200000Z).
I have many indices that map the @timestamp field with a millisecond format, and I'm now getting many indexing errors on the @timestamp field.

I'm able to work around this (and restore millisecond precision) by overriding the field using:

# Copy @timestamp into a temporary string field so it can be edited
mutate {
    add_field => {
        "tmptimestamp" => "%{@timestamp}"
    }
}
# Strip the last three fraction digits (microseconds -> milliseconds).
# This has to be a second mutate: within a single mutate, gsub runs
# before add_field has taken effect.
mutate {
    gsub => [
      "tmptimestamp", "\d{3}Z$", "Z"
    ]
}
# Re-parse the truncated string back into @timestamp, and drop the
# temporary field once the date parses successfully
date {
    match => ["tmptimestamp", "yyyy-MM-dd'T'HH:mm:ss.SSSZ"]
    remove_field => ["tmptimestamp"]
}
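With this in place, an event that Logstash 8 would have serialized as 2022-07-28T09:46:06.200000Z is written out as 2022-07-28T09:46:06.200Z again, so the existing mappings accept it.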

However, this seems inefficient and requires changing many pipelines.

Is there a way to set the @timestamp format so that Logstash 8 produces millisecond precision as well, or is there a more effective solution for this?

Thanks,
Ofir


What errors do you get and what do the mappings of the fields look like?

Hi @Badger

The mapping looks like the following (only showing the @timestamp part):

{
  "mappings": {
    "properties": {
      "@timestamp": {
        "type": "date",
        "format": "yyyy-MM-dd'T'HH:mm:ss.SSSX"
      }
    }
  }
}

The error I'm getting (400 status); note that it fails at index 23, immediately after the three fraction digits that SSS matches:

{
  "error" : {
    "root_cause" : [
      {
        "type" : "mapper_parsing_exception",
        "reason" : "failed to parse field [@timestamp] of type [date] in document with id 'XXX'. Preview of field's value: '2022-07-28T08:09:38.000000Z'"
      }
    ],
    "type" : "mapper_parsing_exception",
    "reason" : "failed to parse field [@timestamp] of type [date] in document with id 'XXX'. Preview of field's value: '2022-07-28T08:09:38.000000Z'",
    "caused_by" : {
      "type" : "illegal_argument_exception",
      "reason" : "failed to parse date field [2022-07-28T08:09:38.000000Z] with format [yyyy-MM-dd'T'HH:mm:ss.SSSX]",
      "caused_by" : {
        "type" : "date_time_parse_exception",
        "reason" : "date_time_parse_exception: Text '2022-07-28T08:09:38.000000Z' could not be parsed at index 23"
      }
    }
  },
  "status" : 400
}
I guess originally we could have just used a plain date mapping and let Elasticsearch infer the format, but this is already applied to many indices, and the current mapping cannot be changed without reindexing.
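For a new index that would mean omitting format entirely and relying on the default (strict_date_optional_time||epoch_millis), which as far as I know accepts anywhere from zero to nine fractional digits:

{
  "mappings": {
    "properties": {
      "@timestamp": {
        "type": "date"
      }
    }
  }
}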

I do not run Elasticsearch, so I cannot test this, but would updating the mapping on the existing index to be

"format": "yyyy-MM-dd'T'HH:mm:ss.SSSX||yyyy-MM-dd'T'HH:mm:ss.SSSSSSX"

work?

Unfortunately, changing format amounts to a mapping update on Elasticsearch, and such an attempt fails with a conflict error.
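For reference, the attempt looked roughly like this (index name hypothetical):

PUT my-index/_mapping
{
  "properties": {
    "@timestamp": {
      "type": "date",
      "format": "yyyy-MM-dd'T'HH:mm:ss.SSSX || yyyy-MM-dd'T'HH:mm:ss.SSSSSSX"
    }
  }
}

and Elasticsearch responds with: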

{
  "error" : {
    "root_cause" : [
      {
        "type" : "illegal_argument_exception",
        "reason" : "Mapper for [@timestamp] conflicts with existing mapper:\n\tCannot update parameter [format] from [yyyy-MM-dd'T'HH:mm:ss.SSSX] to [yyyy-MM-dd'T'HH:mm:ss.SSSX || yyyy-MM-dd'T'HH:mm:ss.SSSSSSX]"
      }
    ],
    "type" : "illegal_argument_exception",
    "reason" : "Mapper for [@timestamp] conflicts with existing mapper:\n\tCannot update parameter [format] from [yyyy-MM-dd'T'HH:mm:ss.SSSX] to [yyyy-MM-dd'T'HH:mm:ss.SSSX || yyyy-MM-dd'T'HH:mm:ss.SSSSSSX]"
  },
  "status" : 400
}

Also, I'd rather avoid changing the Elasticsearch mapping (if possible) and find an effective solution on the Logstash side.

On the Logstash side, the only thing I can think of would be the mutate+mutate+date that you showed.

Thanks, I might open a feature request on GitHub.

Opened an enhancement request in case anyone wants to follow: Configuration for @timestamp field format · Issue #14399 · elastic/logstash · GitHub
