I wanted to make some edits to our visualisations today and noticed that our index (this is not logstash data, but live online game analytics) has grown to 50k+ fields. That part is expected: we expand all the user data dynamically, and I use dynamic mapping on those fields so that they don't get indexed, since I only need to see that part of the data, not search on it. The mapping is set up roughly as in the sketch below.
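For context, the dynamic template is along these lines (a sketch; the template name, index pattern, type name, and user.* path are placeholders rather than our real names):

    curl -XPUT 'localhost:9200/_template/game-analytics' -d '{
      "template": "analytics-*",
      "mappings": {
        "event": {
          "dynamic_templates": [{
            "user_fields": {
              "path_match": "user.*",
              "mapping": { "type": "string", "index": "no" }
            }
          }]
        }
      }
    }'

With "index": "no" the values stay in _source and show up in the document view, but no index structures are built for them.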
The ~2000 core fields worked just fine, and I do visualise based on those, but today, after we added a field there, I couldn't refresh the field list any more. I tried deleting and recreating the index pattern, closing all but the most recent timed index, increasing timeouts from 30s to 90s, and raising the maximum payload bytes by a few orders of magnitude; Kibana always just returns an empty field list. The config side of that looked roughly as below.
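The kibana.yml changes, roughly (setting names as in Kibana 4.2+; the values here are illustrative, not copied verbatim from our config):

    # kibana.yml
    elasticsearch.requestTimeout: 90000   # up from the 30000 ms default
    server.maxPayloadBytes: 1073741824    # default is 1048576 (1 MB)

Neither change had any visible effect.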
This wasn't a problem with the 1.x series, btw.
Elasticsearch itself is fine: our other analytics tooling that pulls data directly from ES is doing just fine, and the ES servers are almost idling. But Kibana... it runs in a container, so I was easily able to confirm that it's not running out of memory.
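(The memory check was just something along these lines; 'kibana' stands in for whatever the container is actually named:)

    docker stats --no-stream kibana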
When I pulled the current mapping via curl, the response was around 1.2 MB.
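That is, something like (index name is again a placeholder):

    curl -s 'localhost:9200/analytics-*/_mapping' | wc -c
    # ~1.2M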
Is there any way to 'force-feed' the .kibana index with the current mappings so that I can continue? Something like the sketch below is what I have in mind.
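This assumes the Kibana 4 layout where each index pattern is a document in .kibana with its field list serialised into a 'fields' JSON string; the field entry format is guessed from what a healthy index-pattern doc looks like, and 'analytics-*' and 'user.level' are placeholders:

    # inspect the current (now empty) field list of the index pattern
    curl -s 'localhost:9200/.kibana/index-pattern/analytics-*?pretty'

    # overwrite its "fields" property with a pre-built list, one entry per field
    curl -XPOST 'localhost:9200/.kibana/index-pattern/analytics-*/_update' -d '{
      "doc": {
        "fields": "[{\"name\":\"user.level\",\"type\":\"number\",\"count\":0,\"scripted\":false,\"indexed\":true,\"analyzed\":false,\"doc_values\":true}]"
      }
    }'

The field list could presumably be generated from the _mapping output with a small script, but I'd rather not reverse-engineer the format if there's a supported way.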
We're also looking into reducing the data to get rid of 20k+ of those fields, but that won't help me today.