Problems while inserting data to index

I am getting the following error:

{"error":{"root_cause":[{"type":"illegal_argument_exception","reason":"Limit of total fields [1000] in index [priceline_hotels] has been exceeded"}].

I got the same error for other indices as well, but it was resolved with the following:

PUT */_settings
{
  "index.mapping.total_fields.limit": 3000
}

However, even after increasing the limit, it is still throwing the same error for this index. Please advise.

Refer to the section about avoiding mapping explosion.
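For context, the "mapping explosion" that section warns about usually comes from indexing documents whose field names are data values, so each new document can add a new mapped field. A minimal illustration (the field structure here is hypothetical, not taken from this index):

```python
# Sketch: using a data value (here a hotel ID) as a field name means every
# new document can introduce a new mapped field, so the mapping grows with
# the data instead of staying fixed. Hypothetical document shape.
docs = [{"rates": {f"hotel_{i}": 100 + i}} for i in range(1000)]

# With dynamic mapping, Elasticsearch would create one field per distinct key:
distinct_fields = {key for doc in docs for key in doc["rates"]}
print(len(distinct_fields))  # 1000 -- already at the default field limit

# The fix is to model the ID as a value, not a field name:
better_docs = [{"hotel_id": f"hotel_{i}", "rate": 100 + i} for i in range(1000)]
# -> only two mapped fields ("hotel_id" and "rate"), regardless of data volume
```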

Increasing this limit for the index is the workaround. After doing so, run a GET request on the index settings to verify the parameter was applied (if you still get the error, it means it was not applied to the index).
Note that this is only a workaround in most cases; the real solution is to avoid using sparse fields and stay well below the limit.
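To see how close an index actually is to the limit, you can count the mapped fields. A rough sketch, assuming you take the `properties` object returned by GET <index>/_mapping (the sample mapping below is made up; object sub-fields and multi-fields both count toward the limit):

```python
# Sketch: count mapped fields in an index mapping to gauge distance from
# index.mapping.total_fields.limit. Both object sub-fields ("properties")
# and multi-fields ("fields") count toward the limit.
def count_fields(properties: dict) -> int:
    total = 0
    for field_def in properties.values():
        total += 1  # the field itself
        total += count_fields(field_def.get("properties", {}))  # object sub-fields
        total += count_fields(field_def.get("fields", {}))      # multi-fields
    return total

# Hypothetical mapping, standing in for the body of GET <index>/_mapping
mapping = {
    "properties": {
        "name": {"type": "text", "fields": {"raw": {"type": "keyword"}}},
        "address": {"properties": {"city": {"type": "keyword"},
                                   "zip": {"type": "keyword"}}},
        "stars": {"type": "integer"},
    }
}
print(count_fields(mapping["properties"]))  # 6
```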

Hi Julien,

I did this, but I am still getting the same error. I increased the field limit to 5000, and when I GET the settings it shows the field limit as 5000, which means the limit is being applied. But in the response I am still getting the same error: "Limit of total fields [1000] in index has been exceeded".
Please advise

Can you please read the limit with the settings API:
GET priceline_hotels/_settings

You should see "limit": "3000"

If you don't, then apply it directly to the index with PUT priceline_hotels/_settings, and please confirm the output is "acknowledged": true.
If it is acknowledged and the setting is still not applied (in the cluster state), then you might be in a split-brain situation, so I would first verify that minimum_master_nodes is set to more than half the total number of master-eligible nodes, and then check the Elasticsearch logs for any issues around cluster state.

Hi Julien,
"acknowledged" is true, and when I check using GET priceline_hotels/_settings it also returns the field limit as 5000.


Can anyone help with this?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.