Limit of total fields [1000] in index has been exceeded after changing case classes to maps

Hello,
I have a Scala Spark job that writes output data to an ES index. I modified my code to recursively convert my case classes into Maps before writing them to the index. Before (when writing the case classes directly) the index write worked fine; now I'm getting the "Limit of total fields ..." error, even though, as far as I can tell, the number of fields being written has not changed.

I have 538 fields defined in the JSON index mapping file (including nested fields), and dynamic mapping is disabled. I wrote a small program to count the individual fields in the failing JSON document, and I get a count of 311 fields. The only way I can get anywhere near 1000 is by counting the fields of every entry in each List separately (entry 1's field count + entry 2's field count, etc.), and even then I get 945 fields, which is still below 1000.
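For reference, the counting logic I used is roughly the following (a simplified sketch, not my exact program: it assumes the document has already been parsed into Scala Maps and Lists, and the `perListEntry` flag switches between the two counting strategies mentioned above):

```scala
// Sketch of the field-counting logic: every Map key (at any depth) is one
// field. For Lists, perListEntry = false counts only the first entry's
// fields, while perListEntry = true counts every entry's fields separately.
object FieldCounter {
  def countFields(value: Any, perListEntry: Boolean): Int = value match {
    case m: Map[_, _] =>
      // Each key is one field, plus whatever nested fields its value has.
      m.values.map(v => 1 + countFields(v, perListEntry)).sum
    case xs: Seq[_] =>
      if (perListEntry) xs.map(countFields(_, perListEntry)).sum
      else xs.headOption.map(countFields(_, perListEntry)).getOrElse(0)
    case _ => 0 // scalar leaf; already counted by its enclosing Map
  }
}
```

For example, for `Map("a" -> 1, "items" -> List(Map("x" -> 1, "y" -> 2), Map("x" -> 3, "y" -> 4)))` the two strategies give 4 and 6 fields respectively.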
I'm not sure what's causing the error. Are Maps somehow excluded from the global dynamic mapping setting? I'm using ES 6.7; this is the beginning of my index definition file:
```json
{
  "sourcing": {
    "_source": { "enabled": true },
    "_all": { "enabled": false },
    "dynamic": false,
    "properties": { ...
```
I'd appreciate it if someone could at least confirm whether Maps are excluded from the dynamic mapping setting, and explain how the field count is calculated (in particular, whether fields of maps inside lists are counted only for the first entry or separately for each entry).
Thanks!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.