How to ignore properties in mapping

Thanks in advance.
I'm retrieving my records from Mongo, then using Node to insert them into ES through the API.

Here is a small part of my mapping:

{
  "price_list_tier": {
    "dynamic": "static",
    "properties": {
      "sku": { "type": "string" },
      "amount": { "type": "double" }
    }
  }
}

I have the following I'd like to insert (just a partial listing):

price_list_tier: {
  sku: "lk38s38d38ck",
  amount: 35.57,
  upcharge: [ { "size": "XL", "amount": 1 }, { "size": "XXL", "amount": 1.25 } ]
}
  1. Node: when Node sees the 1, it treats it as an integer. I then get a mapping error trying to insert a long into a double. What's the best way to handle this?

  2. I want to completely ignore the upcharge property, even if it is included in the object I send up for indexing. No matter what I have tried, I continue to get the mapping error about inserting a long into a double (upcharge.amount). Is it possible to skip the mapping of this property? I have tried:
    a) using dynamic: static, and not defining upcharge.
    b) upcharge: { "indexed": no, "enabled": no }
    c) upcharge: { "type": "object", "dynamic": "static" }

I'm using Compose Elasticsearch 2.4.0.

Hi @bwillis614,

ad 1: you need to handle this on the client side and convert the value explicitly to a double.
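For background, JavaScript has only one number type (an IEEE 754 double), so the long/double distinction only appears when the document is serialized to JSON: whole values lose their decimal point, and Elasticsearch's dynamic mapping then guesses long. A minimal sketch of that behavior:

```javascript
// JavaScript numbers are all doubles internally, but JSON.stringify drops
// the decimal point on whole values, so Elasticsearch's dynamic mapping
// sees a long for 1 and a double for 1.25 -- hence the merge conflict.
const whole = JSON.stringify({ amount: 1 });
const fractional = JSON.stringify({ amount: 1.25 });
console.log(whole);      // '{"amount":1}'
console.log(fractional); // '{"amount":1.25}'
```

Since plain JSON.stringify cannot be forced to emit 1 as 1.0, the practical fix is to map the field explicitly as double, so that Elasticsearch coerces whole numbers on ingest rather than guessing a type.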

ad 2: you need to exclude this field from _source (see docs). Here is a complete example that should get you started:

DELETE /store

PUT /store
{
   "mappings": {
      "product": {
         "_source": {
             "excludes": ["upcharge"]
         },          
          "properties": {
            "product": {
               "type": "string",
               "index": "not_analyzed"
            },
            "amount": {
               "type": "double"
            },
            "upcharge": {
               "type": "object",
               "enabled": false,
               "include_in_all": false
            }
         }
      }
   }
}


PUT /store/product/1
{
    "product": "p1",
    "amount": 12.34,
    "upcharge": [ {"size": "XL", "amount": 1}, {"size": "XXL", "amount": 1.25 }]
}

GET /store/product/1

PUT /store/product/2
{
    "product": "p2",
    "amount": 22.34
}

GET /store/product/2

Daniel


Thank you so much.

I don't understand. I put the following in my mapping:

"price_list_tier": {
"_source": {
"excludes": ["upcharge"]
},
"properties": {
"account_id": {
"type": "string"
},
"upcharges": {
"type": "object",
"enabled": false,
"include_in_all": false
},

and now on upload I get:
[ { type: 'mapper_parsing_exception',
    reason: 'failed to parse',
    caused_by: {
      type: 'illegal_argument_exception',
      reason: 'mapper [upcharges.amount] of different type, current_type [double], merged_type [long]' } },
  { type: 'mapper_parsing_exception',
    reason: 'failed to parse',
    caused_by: {
      type: 'illegal_argument_exception',
      reason: 'mapper [upcharges.amount] of different type, current_type [double], merged_type [long]' } },

I was hoping that Elasticsearch would completely ignore the field.
Is there a way to do that?

I thought that by leaving the field out and putting this in, it would completely ignore the field:

"price_list_tier": {
  "dynamic": "static",
  "properties": {
    "dynamic": "static",

Hi @bwillis614,

it seems you still have documents in your index. If these are just test documents, delete the index, recreate it with the new mapping, and index your documents again. If you're in production, create a new index and use the reindex plugin to reindex your data.

Daniel

Even if the data is bad (I think Mongo/Node are feeding me ints for whole numbers), is there a way to ignore these input fields that I don't want to index, even if they are of the wrong type?

I have dropped the index between runs; there's nothing there. It's my data.

"upcharges" : [
{
"size" : "XXL",
"amount" : NumberInt(0)
},
{
"size" : "2XL",
"amount" : NumberInt(0)
},
{
"size" : "3XL",
"amount" : NumberInt(0)
},
{
"size" : "4XL",
"amount" : 2.5
},
{
"size" : "5XL",
"amount" : 2.5
}
],

I was hoping that since I didn't want to index those fields, it would just drop them, not even look at them, and process the rest of the document. It hasn't happened that way for me yet.

I'm using Compose Elasticsearch:

Database: Elasticsearch 2.4.0
Location: us-east-1 on AWS

There's a bug in some other code that we haven't deployed yet, to force whole numbers to be doubles in Mongo. But for now, I wanted to make this work: Elasticsearch accepting the bad data and ignoring those fields, because they won't be indexed.
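As a client-side stopgap (independent of the mapping), the offending fields could be stripped in Node before the document is ever sent. A minimal sketch, assuming a hypothetical stripFields helper and the field names from this thread:

```javascript
// Hypothetical helper: remove the listed field names anywhere in the
// document before handing it to the Elasticsearch client. It relies on
// JSON.stringify's replacer, which omits any key it returns undefined for.
function stripFields(doc, fields) {
  return JSON.parse(JSON.stringify(doc, (key, value) =>
    fields.includes(key) ? undefined : value
  ));
}

const doc = {
  sku: "lk38s38d38ck",
  amount: 35.57,
  upcharges: [{ size: "XL", amount: 1 }]
};
// Dynamic mapping never sees "upcharges" because it is gone before indexing.
const cleaned = stripFields(doc, ["upcharges"]);
```

Because the replacer visits every key, this also removes nested occurrences of the field, at the cost of a full serialize/parse round trip per document.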

OK, my last email. My latest attempt didn't work on the disparate data either.

"price_list_tier": {
"_source": {
"excludes": [
"upcharges",
"integrations.lemonstand.price_tiers"
]
},
"upcharges": {
"type": "object",
"enabled": false,
"indexed": "no",
"include_in_all": false
},

Hi @bwillis614,

there seems to be something wrong with your mapping or your data. Can you provide a small, self-contained example (see my example above) that reproduces the problem you're experiencing?

Daniel

Daniel, I attached two files, both in JSON format. I hope that is good enough?
price_list_tier.json is my model.
dat_to_insert.json is my one row of data, with disparate array data types in the field I'm hoping to completely ignore.

If that doesn't work for you, let me know and I'll comply. I haven't done any direct posting, only API work through the Node driver.

Hi @bwillis614,

it seems Discuss scrubbed the attachments. Can you provide them as a gist or something similar?

Daniel

I'll try. Hopefully I get this right.

DELETE /store

PUT /store
{
   "mappings": {
      "price_list_tier": {
         "_source": {
            "excludes": [
               "upcharges",
               "integrations.lemonstand.price_tiers"
            ]
         },
         "properties": {
            "account_id": {
               "type": "string"
            },
            "amount": {
               "type": "double"
            },
            "created_on": {
               "type": "date",
               "format": "strict_date_optional_time||epoch_millis"
            },
            "integrations": {
               "properties": {
                  "lemonstand": {
                     "properties": {
                        "last_sync": {
                           "type": "date",
                           "format": "strict_date_optional_time||epoch_millis"
                        },
                        "shop_customer_group_id": {
                           "type": "string"
                        },
                        "price_tiers": {
                           "type": "object",
                           "enabled": false,
                           "index": "no",
                           "include_in_all": false
                        }
                     }
                  }
               }
            },
            "max_qty": {
               "type": "long"
            },
            "min_qty": {
               "type": "long"
            },
            "name": {
               "type": "string"
            },
            "price_list_id": {
               "type": "string"
            },
            "product_id": {
               "type": "string"
            },
            "product_name": {
               "type": "string"
            },
            "product_number": {
               "type": "string"
            },
            "sku": {
               "type": "string"
            },
            "type": {
               "type": "string"
            },
            "upcharges": {
               "type": "object",
               "enabled": false,
               "index": "no",
               "include_in_all": false
            },
            "updated_on": {
               "type": "date",
               "format": "strict_date_optional_time||epoch_millis"
            },
            "variants": {
               "type": "string"
            }
         }
      }
   }
}

PUT /store/price_list_tier/1
{
   "_id": "581375b968943f0e98a6266e",
   "name": "n1",
   "variants": [
      "*"
   ],
   "upcharges": [
      {
         "size": "XXL",
         "amount": 0
      },
      {
         "size": "2XL",
         "amount": 0
      },
      {
         "size": "3XL",
         "amount": 0
      },
      {
         "size": "4XL",
         "amount": 2.5
      },
      {
         "size": "5XL",
         "amount": 2.5
      }
   ],
   "min_qty": 1,
   "max_qty": 1,
   "type": "USD",
   "amount": 3.7,
   "created_on": "2016-10-13T12:18:59.172+0000",
   "updated_on": null,
   "integrations": {
      "lemonstand": {
         "last_sync": "2016-10-17T21:02:32.426+0000",
         "price_tiers": [
            {
               "sku": "10847363052669",
               "pricetier_id": 25142
            }
         ],
         "shop_customer_group_id": 22
      }
   },
   "account_id": "127fe169-720e-4e4a-ba48-448e054d4ef2",
   "price_list_id": "c7a224bc-67b4-d173-8df1-1d2dfe636890",
   "product_id": "b2ffe54c-7fe6-17e9-1f4e-ce1826ec7a1b",
   "product_number": "4400CY",
   "sku": "10847363052669",
   "product_name": "Creeper "
}

Hi @bwillis614,

the example helped me reproduce the problem, and I could also solve your original problem with mixed long/double values this way.

The only change I made was here:

            "upcharges": {
               "type": "object",
               "enabled": false,
               "include_in_all": false,
               "properties": {
                   "amount": {
                       "type": "double"
                   }
               }
            }

For reference, I'll also include the complete mapping:

PUT /price_list_tier
{
   "mappings": {
      "price_list_tier": {
         "_source": {
            "excludes": [
               "upcharges",
               "integrations.lemonstand.price_tiers"
            ]
         },
         "properties": {
            "account_id": {
               "type": "string"
            },
            "amount": {
               "type": "double"
            },
            "created_on": {
               "type": "date",
               "format": "strict_date_optional_time||epoch_millis"
            },
            "integrations": {
               "properties": {
                  "lemonstand": {
                     "properties": {
                        "last_sync": {
                           "type": "date",
                           "format": "strict_date_optional_time||epoch_millis"
                        },
                        "shop_customer_group_id": {
                           "type": "string"
                        },
                        "price_tiers": {
                           "type": "object",
                           "enabled": false,
                           "include_in_all": false
                        }
                     }
                  }
               }
            },
            "max_qty": {
               "type": "long"
            },
            "min_qty": {
               "type": "long"
            },
            "name": {
               "type": "string"
            },
            "price_list_id": {
               "type": "string"
            },
            "product_id": {
               "type": "string"
            },
            "product_name": {
               "type": "string"
            },
            "product_number": {
               "type": "string"
            },
            "sku": {
               "type": "string"
            },
            "type": {
               "type": "string"
            },
            "upcharges": {
               "type": "object",
               "enabled": false,
               "include_in_all": false,
               "properties": {
                   "amount": {
                       "type": "double"
                   }
               }
            },
            "updated_on": {
               "type": "date",
               "format": "strict_date_optional_time||epoch_millis"
            },
            "variants": {
               "type": "string"
            }
         }
      }
   }
}

With this change, I could add your document to the index without any problems:

POST /price_list_tier/price_list_tier/581375b968943f0e98a6266e
{
   "name": "n1",
   "variants": [
      "*"
   ],
   "upcharges": [
      {
         "size": "XXL",
         "amount": 0
      },
      {
         "size": "2XL",
         "amount": 0
      },
      {
         "size": "3XL",
         "amount": 0
      },
      {
         "size": "4XL",
         "amount": 2.5
      },
      {
         "size": "5XL",
         "amount": 2.5
      }
   ],
   "min_qty": 1,
   "max_qty": 1,
   "type": "USD",
   "amount": 3.7,
   "created_on": "2016-10-13T12:18:59.172+0000",
   "updated_on": null,
   "integrations": {
      "lemonstand": {
         "last_sync": "2016-10-17T21:02:32.426+0000",
         "price_tiers": [
            {
               "sku": "10847363052669",
               "pricetier_id": 25142
            }
         ],
         "shop_customer_group_id": 22
      }
   },
   "account_id": "127fe169-720e-4e4a-ba48-448e054d4ef2",
   "price_list_id": "c7a224bc-67b4-d173-8df1-1d2dfe636890",
   "product_id": "b2ffe54c-7fe6-17e9-1f4e-ce1826ec7a1b",
   "product_number": "4400CY",
   "sku": "10847363052669",
   "product_name": "Creeper "
}
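With enabled: false the upcharges object is never parsed for indexing, and with _source.excludes it is also stripped from the stored document, so it should be absent when the document is fetched back. A quick sanity check in Console (use whichever index name you actually indexed into):

```
GET /price_list_tier/price_list_tier/581375b968943f0e98a6266e
```

The returned _source should contain every field except the excluded upcharges and integrations.lemonstand.price_tiers.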

Daniel

Thanks for all the help. This must be a Node/driver issue. I still get the error.

Index price_list_tier does not exist, attempting to create

mapping:
{"price_list_tier":{"_source":{"excludes":["upcharges","integrations.lemonstand.price_tiers"]},"properties":{"account_id":{"type":"string"},"amount":{"type":"double"},"created_on":{"type":"date","format":"strict_date_optional_time||epoch_millis"},"integrations":{"properties":{"lemonstand":{"properties":{"last_sync":{"type":"date","format":"strict_date_optional_time||epoch_millis"},"shop_customer_group_id":{"type":"string"},"price_tiers":{"type":"object","enabled":false,"index":"no","include_in_all":false}}}}},"max_qty":{"type":"long"},"min_qty":{"type":"long"},"name":{"type":"string"},"price_list_id":{"type":"string"},"product_id":{"type":"string"},"product_name":{"type":"string"},"product_number":{"type":"string"},"sku":{"type":"string"},"type":{"type":"string"},
"upcharges":{"type":"object","enabled":false,"include_in_all":false,"properties":{"amount":{"type":"double"}}}
,"updated_on":{"type":"date","format":"strict_date_optional_time||epoch_millis"},"variants":{"type":"string"}}}}

Successfully created index price_list_tier

[ { type: 'mapper_parsing_exception',
    reason: 'failed to parse',
    caused_by: {
      type: 'illegal_argument_exception',
      reason: 'mapper [upcharges.amount] of different type, current_type [long], merged_type [double]' } },
  { type: 'mapper_parsing_exception',
    reason: 'failed to parse',
    caused_by: {
      type: 'illegal_argument_exception',
      reason: 'mapper [upcharges.amount] of different type, current_type [long], merged_type [double]' } } ]

Hi @bwillis614,

Bummer. It worked fine in Console. Did you try it there too?

That's weird. Is your index name really price_list_tier? In the example above it was called store.

Daniel

Yeah, I cheated: I took your original reply and pasted my stuff into it, and I just missed that part. Index = price_list_tier.
I have not tried it in Console, just the Node driver.
I haven't had issues with my other mappings, just this one. I'm assuming it's because of the disparate upcharges.amount types.
My other field at the root level is also named amount, but it shares the type double, which is desired.
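The latest error reports current_type [long], which suggests the index that actually received the documents does not carry the explicit upcharges.amount double mapping shown above, for example because documents were indexed before the mapping was applied and dynamic mapping won first. One way to diagnose, in Console or via curl, is to inspect the mapping the cluster actually holds:

```
GET /price_list_tier/_mapping
```

If upcharges shows dynamically created sub-fields instead of the explicit double, the index was created (or written to) before the mapping was put in place.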

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.