Hi,
I have a setup where many different JSON logs are sent to Logstash, which forwards them to Elasticsearch. The documents contain so many distinct fields that we hit the default limit of 1000 fields per index. To prevent a mapping explosion we don't want to raise this limit.
I read about using "key/value with nested fields" in the topic "Limit of total fields [1000] in index has been exceeded" and saw an example in https://www.elastic.co/blog/found-crash-elasticsearch#mapping-explosion, but I wonder how I would achieve that with Logstash.
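As I understand the blog post, the idea is to index a single nested array of key/value pairs instead of one concretely mapped field per JSON key, so the index only ever needs the two fields `key` and `value`. Something like this, where the field names (`user.id`, `session.region`) and the target field `kv` are just made-up examples of mine:

```
{ "user.id": 42, "session.region": "eu" }
```

would be indexed as

```
{
  "kv": [
    { "key": "user.id",        "value": "42" },
    { "key": "session.region", "value": "eu" }
  ]
}
```

with `kv` mapped as a `nested` type whose `key` and `value` sub-fields are keywords.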
Could anyone give me a hint on how to split long field names (containing `.`) in a way that avoids mapping explosion?
My problem is that new field names can come in at any time, so I need an automatic way of splitting them.
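The closest I've come up with is a ruby filter that moves every field into such an array, since I can't enumerate the field names in advance. This is only a sketch; the whitelist, the `kv` target field, and stringifying all values with `to_s` are my own assumptions:

```
filter {
  ruby {
    code => '
      # Fields that should stay as regular mapped fields (my guess at a
      # sensible whitelist; adjust to taste).
      keep = ["@timestamp", "@version", "message", "host", "tags"]
      pairs = []
      # event.to_hash returns a copy, so removing fields while iterating is safe.
      event.to_hash.each do |name, value|
        next if keep.include?(name)
        # Stringify values for simplicity; nested objects also end up as strings here.
        pairs << { "key" => name, "value" => value.to_s }
        event.remove(name)
      end
      event.set("kv", pairs)
    '
  }
}
```

Would something like this be the right approach, or is there a more idiomatic way to do it in Logstash?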
Thanks in advance!