Invalid date because milliseconds does not contain trailing zeros


TL;DR:

Valid: 2019-07-10T17:01:01.500Z
Valid: 2019-07-10T17:01:01.50Z
INVALID: 2019-07-10T17:01:01.5Z

I want to know how to get Elasticsearch to accept the rejected value, since it is a valid ISO 8601 date.
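One client-side workaround is to pad the fractional-seconds field to three digits before indexing, so the timestamp always matches what `strict_date_optional_time` expects. This is just a sketch (the helper name `pad_millis` is mine, not part of any API):

```python
import re

def pad_millis(ts: str) -> str:
    """Pad a 1- or 2-digit fractional-seconds field to three digits,
    e.g. '...:01.5Z' -> '...:01.500Z', leaving 3-digit values untouched."""
    return re.sub(
        r"\.(\d{1,2})(?=Z$|[+-]\d{2}:\d{2}$)",
        lambda m: "." + m.group(1).ljust(3, "0"),
        ts,
    )
```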


I'm trying to index a document that contains a date property. If this property holds a time whose rounded milliseconds leave only one digit, indexing fails with the following error:

{
  "error": {
    "root_cause": [
      {
        "type": "mapper_parsing_exception",
        "reason": "failed to parse field [value_at] of type [date] in document with id '1'"
      }
    ],
    "type": "mapper_parsing_exception",
    "reason": "failed to parse field [value_at] of type [date] in document with id '1'",
    "caused_by": {
      "type": "illegal_argument_exception",
      "reason": "failed to parse date field [2019-07-10T17:01:01.5Z] with format [strict_date_optional_time||epoch_millis]",
      "caused_by": {
        "type": "date_time_parse_exception",
        "reason": "Failed to parse with all enclosed parsers"
      }
    }
  },
  "status": 400
}

Steps to reproduce:

PUT my_index/_doc/1
{
  "value_at": "2019-07-10T17:01:01.500Z"
}

PUT my_index/_doc/1
{
  "value_at": "2019-07-10T17:01:01.5Z"
}
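If upgrading is not an option, another possible workaround (an untested sketch; the extra java-time pattern with optional `[.S]` sections is my assumption, not something from the docs) is to create the index with an explicit date `format` that also admits one- and two-digit fractions:

```
PUT my_index
{
  "mappings": {
    "properties": {
      "value_at": {
        "type": "date",
        "format": "strict_date_optional_time||epoch_millis||yyyy-MM-dd'T'HH:mm:ss[.SSS][.SS][.S]XXX"
      }
    }
  }
}
```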

Mapping generated by the first PUT:

{
  "my_index" : {
    "aliases" : { },
    "mappings" : {
      "properties" : {
        "value_at" : {
          "type" : "date"
        }
      }
    },
    "settings" : {
      "index" : {
        "creation_date" : "1562792878369",
        "number_of_shards" : "1",
        "number_of_replicas" : "1",
        "uuid" : "gXOZiC0PTFuQKaOwdeoyig",
        "version" : {
          "created" : "7000199"
        },
        "provided_name" : "my_index"
      }
    }
  }
}

Is it possible to get some help here? :thinking:

I have no idea. Which version is that?

Hey @dadoonet, thanks for the interest in the topic.

Both Elasticsearch and Kibana are running version 7.0.1.

Could you try with 7.2 in case something has been fixed in the meantime?


@Thiago_Coimbra_Lemos

Hi, I just ran all of this in 7.2.0 and everything worked fine:

PUT my_index/_doc/1
{
  "value_at": "2019-07-10T17:01:01.500Z"
}

GET my_index
{
  "my_index" : {
    "aliases" : { },
    "mappings" : {
      "properties" : {
        "value_at" : {
          "type" : "date"
        }
      }
    },
    "settings" : {
      "index" : {
        "creation_date" : "1563048120913",
        "number_of_shards" : "1",
        "number_of_replicas" : "1",
        "uuid" : "2WuVHjLyTXSHx2fzeQDRCg",
        "version" : {
          "created" : "7020099"
        },
        "provided_name" : "my_index"
      }
    }
  }
}

PUT my_index/_doc/2
{
  "value_at": "2019-07-10T17:01:01.50Z"
}


PUT my_index/_doc/3
{
  "value_at": "2019-07-10T17:01:01.5Z"
}

GET my_index/_search
{
  "took" : 0,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 3,
      "relation" : "eq"
    },
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "my_index",
        "_type" : "_doc",
        "_id" : "1",
        "_score" : 1.0,
        "_source" : {
          "value_at" : "2019-07-10T17:01:01.500Z"
        }
      },
      {
        "_index" : "my_index",
        "_type" : "_doc",
        "_id" : "2",
        "_score" : 1.0,
        "_source" : {
          "value_at" : "2019-07-10T17:01:01.50Z"
        }
      },
      {
        "_index" : "my_index",
        "_type" : "_doc",
        "_id" : "3",
        "_score" : 1.0,
        "_source" : {
          "value_at" : "2019-07-10T17:01:01.5Z"
        }
      }
    ]
  }
}

If you are tied to that version, you could use Logstash with some logic to parse and fix the timestamps before sending them to Elasticsearch.
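As a sketch of that approach (untested, and assuming the field arrives as `value_at`), Logstash's `date` filter understands ISO 8601 with a variable number of fractional digits and re-emits a normalized timestamp:

```
filter {
  date {
    # ISO8601 accepts 2019-07-10T17:01:01.5Z as well as .500Z
    match  => ["value_at", "ISO8601"]
    target => "value_at"
  }
}
```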


Just confirmed that this is working in 7.2.0.

We will update to this version.

Thanks @stephenb and @dadoonet.
