Hi,
We made the mistake of storing details_per_category like this:
{
cat1: object,
cat2: object,
...
}
Now there are thousands of categories, which generated thousands of new fields in our index. We corrected it by storing the details per category in an array:
[
{
"category" : catid,
...other info (3 fields more)
},
...
]
Unfortunately, we now sometimes run out of memory on a node because the number of fields in the index is around 5000.
Can we just remove all those properties (details_per_category.cat1, details_per_category.cat2, etc.) from the mapping?
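For context, the number of fields an index accepts is controlled by the index.mapping.total_fields.limit setting (default 1000), so we assume ours was already raised. A temporary workaround while we decide, sketched with a placeholder index name, would be raising it further, though that only postpones the memory problem:

```json
PUT /our_index/_settings
{
  "index.mapping.total_fields.limit": 6000
}
```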
We cleaned up the docs and there are no more documents containing these fields. However, when I try to change the mapping to define only this:
{
"properties": {
"details_per_category": {
"properties": {
"name": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"category_id": {
"type": "long"
},
... // 3 more fields
}
}
}
}
and then execute a get mapping request, it still returns the old fields.
Is the only solution to reindex the data using the reindex operation (maybe in combination with an alias), or can we change this mapping and lower the fields limit on the existing index?
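In case it helps frame the question, here is the reindex-plus-alias flow we are considering, sketched with placeholder names (our_index, our_index_v2, our_index_alias) and a trimmed-down mapping. Create a new index with only the desired fields, copy the documents over, then swap the alias so clients are untouched:

```json
PUT /our_index_v2
{
  "mappings": {
    "properties": {
      "details_per_category": {
        "properties": {
          "category_id": { "type": "long" }
        }
      }
    }
  }
}

POST /_reindex
{
  "source": { "index": "our_index" },
  "dest":   { "index": "our_index_v2" }
}

POST /_aliases
{
  "actions": [
    { "remove": { "index": "our_index",    "alias": "our_index_alias" } },
    { "add":    { "index": "our_index_v2", "alias": "our_index_alias" } }
  ]
}
```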
Thanks in advance,
Amer