Limit of total fields [1000] in index has been exceeded for particular JSONs

I have huge JSON files, so I split them and index the pieces one by one, but no matter how small the splits get, I still get the total fields limit exception.
I'm using Elasticsearch from Python, and I split the JSON objects using Python dictionaries. Is there anything I can do to prevent this exception?
The interesting thing is that I get this exception only on particular JSON files, not on all of them. Is there something wrong with the JSON or with the split? But then, shouldn't I get the limit exception on every JSON file?

By the way, can anyone tell me how to configure the field limit in the elasticsearch.yml file?

Are you sure you want to have more than 1000 fields?
You should check your JSON, IMO.

But look at https://www.elastic.co/guide/en/elasticsearch/reference/7.6/mapping.html#mapping-limit-settings

Hey,
try splitting your JSONs into fewer keys, or create keys that hold metadata about your keys (for example, if you have IDs, don't make a key for every ID; create one key named id and query on it when you need a specific ID).
As for the field limit, you configure it in the index settings, not in elasticsearch.yml: see the Total Fields Limit setting.
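A minimal sketch of that restructuring in Python (the document shape and field names here are hypothetical, just to illustrate the idea): if every ID becomes its own key, each new ID adds another field to the mapping, but if the IDs are stored as values under one fixed key, the mapping stays constant no matter how many IDs you index.

```python
# Hypothetical document whose keys are IDs: every new ID adds a new
# mapped field, which is what blows past the total fields limit.
bad_doc = {
    "restaurant-1": {"rating": 4.5},
    "restaurant-2": {"rating": 3.8},
}

# Restructured: IDs become *values* under fixed keys, so the mapping
# only ever contains "entries", "entries.id" and "entries.rating".
good_doc = {
    "entries": [
        {"id": rid, "rating": attrs["rating"]}
        for rid, attrs in bad_doc.items()
    ]
}

print(good_doc)
```

You can then find a specific ID with a normal term query on `entries.id` instead of relying on a field named after the ID.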

Hi @Bahadir_Eyuboglu.

You can also set the maximum number of hits returned (10000 by default) at query time:

`localhost:9200/kibana_sample_data_flights/_search?size=10000`

In my case the response size was about 10 MB.

If you exceed that limit, the error is:

`reason: "Result window is too large, from + size must be less than or equal to: [10000] but was [100001]. See the scroll api for a more efficient way to request large data sets. This limit can be set by changing the [index.max_result_window] index level setting."`
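For reference, that window can be raised per index. This is a sketch of the settings body only (the index name is taken from the example above); for genuinely large result sets the scroll or search_after APIs mentioned in the error are the better fix.

```python
import json

# Body for: PUT localhost:9200/kibana_sample_data_flights/_settings
# Raises the result-window cap so that from + size may go up to 20000.
settings_body = {"index": {"max_result_window": 20000}}

print(json.dumps(settings_body))
```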

Thanks
HadoopHelp

Well, it's not in my hands; I'm only working on the database part of the project. The JSON files come to me with these fields already set.

Thanks, I will try it once I have successfully indexed the data.

I know, but I don't use Kibana, so I tried setting it from the terminal with:

`curl -X PUT localhost:9200/restaurants -H 'Content-Type: application/json' -d '{ "index.mapping.total_fields.limit": 2000 }'`

and the error I get is:

`{"error":{"root_cause":[{"type":"parse_exception","reason":"unknown key [index.mapping.total_fields.limit] for create index"}],"type":"parse_exception","reason":"unknown key [index.mapping.total_fields.limit] for create index"},"status":400}`

Can anyone help?

You are missing the "settings" key here. Have a look at the documentation: https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-create-index.html
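A sketch of the corrected body, assuming the same index name and limit as in the failing curl command: the index-level setting has to be nested under a top-level "settings" object, which is exactly what the parse_exception is complaining about.

```python
import json

# Correct body for: curl -X PUT localhost:9200/restaurants
# The setting must sit under a top-level "settings" object; passing it
# at the top level of a create-index request causes the parse_exception.
body = {
    "settings": {
        "index.mapping.total_fields.limit": 2000
    }
}

print(json.dumps(body, indent=2))
```

For an index that already exists, the same settings object can instead be sent to the update-settings endpoint (`PUT localhost:9200/restaurants/_settings`), since this is a dynamic index setting.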

But even if you increase the max field limit, if you keep adding more keys per document you will exceed the new limit too.
You need to solve the problem on the JSON side, because otherwise you will lose performance.

Thanks, I didn't even notice.

Yeah, I don't think there will be 10,000 fields, but I will keep it in mind.

Thanks for the advice.