Max field count and performance impact

I am trying out several mappings for my documents, and it turns out that the most convenient one is a data model with thousands of fields per document.

The mapping `[{param: ALT, value: 40000}, {param: LON, value: 1.36}, {param: SPEED, value: 0.76}, ...]` was the best one in terms of flexibility, but querying it — for instance to get the average SPEED over ALT buckets of 1000 ft — is not easy, and in particular not possible with Kibana.

I'd rather use the following: `{ALT: 40000, LON: 1.36, SPEED: 0.76}`, which allows easy querying.
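For example, with that flat mapping the "average SPEED per 1000 ft of ALT" question becomes a standard histogram aggregation with an `avg` sub-aggregation (a sketch against the `_search` API; the index name `flights` is made up for illustration):

```json
POST /flights/_search
{
  "size": 0,
  "aggs": {
    "alt_buckets": {
      "histogram": { "field": "ALT", "interval": 1000 },
      "aggs": {
        "avg_speed": { "avg": { "field": "SPEED" } }
      }
    }
  }
}
```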

In my situation (aircraft data) the number of parameters (fields) reaches several thousand.

Is there a limitation on this? Or can you help me with the first mapping?


Hi @dao,

yes, there is a limit on the maximum number of fields in an index. The default value is 1000, which is already very high, so I suggest not increasing it even further.
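The limit is controlled by the `index.mapping.total_fields.limit` index setting. If you do decide to raise it despite the advice above, it can be updated on an existing index like this (the index name `my-index` and the value are placeholders):

```json
PUT /my-index/_settings
{
  "index.mapping.total_fields.limit": 2000
}
```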

It depends a bit on your use case, but I also think that mapping #2 is more straightforward. When creating a data model you should think about the structure of your data and how you want to query it. Maybe you can split the data into multiple (use-case-specific) indices?
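Splitting by use case could look like the sketch below — one index per parameter family, each staying well under the field limit (the index names, field grouping, and types here are hypothetical, not taken from the thread):

```json
PUT /aircraft-position
{
  "mappings": {
    "properties": {
      "ALT": { "type": "integer" },
      "LON": { "type": "float" }
    }
  }
}

PUT /aircraft-performance
{
  "mappings": {
    "properties": {
      "SPEED": { "type": "float" }
    }
  }
}
```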


OK, I can partition the fields.

