Transforms: date epoch_millis parsing errors

Hello,

We have a 7.6.2 elasticsearch cluster.

A transform is failing with the following parsing error:

    261]: index [t_transform_testing], type [_doc], id [QWFGhHsjq4vzL5KDKy_E9J2TAAAAAAAA], message [MapperParsingException[failed to parse field [lastError] of type [date] in document with id 'QWFGhHsjq4vzL5KDKy_E9J2TAAAAAAAA'. Preview of field's value: '1.588590892488E12']; nested: IllegalArgumentException[failed to parse date field [1.588590892488E12] with format [strict_date_optional_time||epoch_millis]]; nested: NotSerializableExceptionWrapper[date_time_parse_exception: Failed to parse with all enclosed parsers];]

It seems that it doesn't accept "1.588590892488E12" as an epoch_millis date. The strange thing is that this value is produced by a "toInstant().toEpochMilli()" call, used in this aggregation:

    "lastError": {
      "scripted_metric": {
        "init_script": "state.timestamps = []",
        "map_script": "if (doc.error.value == true) { state.timestamps.add(doc.timestamp.value.toInstant().toEpochMilli()) }",
        "combine_script": "return state.timestamps.length > 0 ? Collections.min(state.timestamps) : -1L",
        "reduce_script": "double first = 0; for (a in states) { if(!(a == -1L) && (a < first || first == 0)) { first = a } } return first"
      }
    }

Is it a bug, or am I missing something?

Thanks in advance.

This is a mapping error: if you try to index a document with 1.588590892488E12 as the value of a date field, you get the same error.

Any reason you use double (a floating-point number)? If you change your script to use long, I am sure it will work.
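The difference is easy to reproduce in plain Java, whose number-to-string behavior Painless inherits; a minimal sketch (the value is the one from the error message):

```java
public class EpochMillisDemo {
    public static void main(String[] args) {
        long millis = 1588590892488L;

        // Doubles with magnitude >= 10^7 are rendered in scientific
        // notation, which epoch_millis cannot parse.
        System.out.println(Double.toString((double) millis)); // 1.588590892488E12

        // A long keeps plain digits, which epoch_millis accepts.
        System.out.println(Long.toString(millis)); // 1588590892488
    }
}
```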


You are totally right @Hendrik_Muhs! I was too focused on the toEpochMilli() function. After changing the double to a long, it works.
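For anyone hitting the same error: the change was only in the reduce_script, declaring the accumulator as a long instead of a double (a sketch; the other scripts stay as posted above):

```json
"reduce_script": "long first = 0; for (a in states) { if (a != -1L && (a < first || first == 0)) { first = a } } return first"
```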

Thank you very much.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.