Using "key/value with nested fields" with logstash

Hi,

I have a problem where someone logs lots of different JSON logs into Logstash, which forwards them to Elasticsearch. There are so many fields that we hit the 1000-field-per-index limit. To prevent mapping explosion, we don't want to raise this limit.

I read about using "key/value with nested fields" in the thread "Limit of total fields [1000] in index has been exceeded" and saw an example in https://www.elastic.co/blog/found-crash-elasticsearch#mapping-explosion, but I wonder how I would achieve that with Logstash.

Could anyone give me a hint on how to split long field names (containing dots) in a way that avoids mapping explosion?

My problem is that new field names may come in at any time, so I need an automatic way of splitting them.
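To illustrate what I mean: something like the sketch below, which uses a ruby filter to collapse all dynamic fields under a subtree into a fixed key/value array, so the mapping only ever contains two field names. Here `[app]` and `[app_kv]` are hypothetical field names, not anything from my actual pipeline:

```
filter {
  ruby {
    code => '
      # Hypothetical example: collect the dynamic fields under [app]
      # into an array of { "key" => ..., "value" => ... } objects,
      # so Elasticsearch only needs mappings for "key" and "value".
      kv = []
      fields = event.get("[app]") || {}
      fields.each { |k, v| kv << { "key" => k, "value" => v.to_s } }
      event.set("[app_kv]", kv)
      event.remove("[app]")
    '
  }
}
```

This is just how I imagine it could work; I don't know whether this is the recommended approach.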

Thanks in advance!

I just saw that the "de_dot" filter of Logstash provides the capability to replace dots with nested fields, not just replace dots with another character. Could this solve my problem?
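If I understand the docs correctly, it would look something like this (a minimal sketch; I haven't tested it):

```
filter {
  de_dot {
    # nested => true turns "a.b.c" into { "a" => { "b" => { "c" => ... } } }
    # instead of just replacing the dots with the separator character
    nested => true
  }
}
```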

Will heavily nested fields work better with Elasticsearch than lots of different field names? (Maybe I should ask this in the Elasticsearch forums, too.)

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.