Varying data types in object causing mapping errors

I have a data object called "weekly_values" that I want to send to an existing Elasticsearch index. The object contains fields that, when populated, hold float values. However, these fields do not always have a value, and when they don't I set them to a default of '0'.

I use dynamic mapping, so the very first time Elasticsearch encountered some of these fields they had a value of '0' and it mapped them as long. Then, the first time it encountered the same field with real data, which was a float, indexing failed with a mapping error (see attached).

As a fix, I set the default value to 0.0 instead of 0, thinking this would make the field a float whether it has data or not. But I found that Elasticsearch still treats the field as a long, as if 0.0 were rounded to 0. I tried using the update mapping API to explicitly set these fields' data types in advance, but there are just too many of them (about 300). I'd like to continue using dynamic mapping; how can I prevent this mapping error? Thanks!

Did you send the data to a new index? It won't work if you keep using the same index, as the mapping of an existing field cannot be changed.

You need to use a new index.
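
If you still need the documents that are already indexed, one option is to create the new index with the correct mapping and copy the data across with the Reindex API. A minimal sketch; the index names (weekly_values_v1, weekly_values_v2) and the field name some_float_field are placeholders for your own:

PUT weekly_values_v2
{
  "mappings": {
    "properties": {
      "some_float_field": { "type": "float" }
    }
  }
}

POST _reindex
{
  "source": { "index": "weekly_values_v1" },
  "dest": { "index": "weekly_values_v2" }
}

Once the reindex completes, point your ingestion at the new index (or swap an alias over to it).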

I don't think this is correct. If Elasticsearch receives a numeric value of 0.0, it will map the field as a float.

You can test with the following requests:

POST discuss-float-example/_doc/
{
  "float_value": 0.0
}

And using GET discuss-float-example/_mapping will return the following:

{
  "discuss-float-example": {
    "mappings": {
      "properties": {
        "float_value": {
          "type": "float"
        }
      }
    }
  }
}
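
That said, if you want every numeric field to be mapped as float regardless of which value arrives first, a dynamic template on a new index is one way to do it without listing all ~300 fields. A sketch, assuming a fresh index; the index name and template name here are placeholders:

PUT weekly-values-example
{
  "mappings": {
    "dynamic_templates": [
      {
        "longs_as_floats": {
          "match_mapping_type": "long",
          "mapping": { "type": "float" }
        }
      }
    ]
  }
}

With this in place, a first document containing a whole-number value like 0 would still be dynamically mapped as float, so later float values index without a conflict.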

Thanks for the reply. You're right, 0.0 does map as a float. I think my issue was actually caused by some bad data; it's been resolved now.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.