How to cope with huge nested fields?


(Milkana Stateva) #1

Hi all,
We are running Elasticsearch 6.x with a mapping that contains nested fields of various values and sizes. Some of the fields hold up to 1M short string values. Users should be able to filter and run aggregations on them in reasonable time (milliseconds).
We want to optimize queries against these fields by splitting them into smaller chunks, and we would like to know which of the following approaches is better.
The field mapping is:

```json
{
  "myField": {
    "type": "nested",
    "properties": {
      "id": { "type": "keyword" },
      "values": {
        "type": "text",
        "fields": {
          "keyword": { "type": "keyword" }
        }
      }
    }
  }
}
```
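For context, a typical filter-plus-aggregation request against this mapping might look like the sketch below (the index name `my-index` and the term value `some-value` are hypothetical, not from our actual setup):

```json
GET my-index/_search
{
  "size": 0,
  "query": {
    "nested": {
      "path": "myField",
      "query": { "term": { "myField.values.keyword": "some-value" } }
    }
  },
  "aggs": {
    "my_field": {
      "nested": { "path": "myField" },
      "aggs": {
        "top_values": {
          "terms": { "field": "myField.values.keyword" }
        }
      }
    }
  }
}
```

With 1M values in a single nested field, both the `nested` query and the `terms` aggregation have to visit a very large number of values, which is why we are considering chunking.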

Option 1:
Split the field by limiting the size of the `values` array. Instead of one document with 1M values, we index 10 documents with 100K values each.

Option 2:
Keep a single document, but model the chunks as an array of nested objects, so each chunk gets its own nested (Lucene) document.
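To make the two options concrete, here is a sketch of what indexing would look like for each (index name, `_id` values, and the sample strings are hypothetical):

```json
POST my-index/_bulk
{ "index": { "_id": "entity-1-chunk-0" } }
{ "myField": { "id": "entity-1", "values": ["v0", "v1", "v2"] } }
{ "index": { "_id": "entity-1-chunk-1" } }
{ "myField": { "id": "entity-1", "values": ["v100000", "v100001"] } }
```

Option 2 would instead keep one top-level document whose `myField` is an array of nested objects, one per chunk:

```json
PUT my-index/_doc/entity-1
{
  "myField": [
    { "id": "chunk-0", "values": ["v0", "v1", "v2"] },
    { "id": "chunk-1", "values": ["v100000", "v100001"] }
  ]
}
```

Since `myField` is mapped as `nested`, each element of the array in option 2 is already stored as a separate hidden Lucene document, so the two options differ mainly in whether the chunks share a top-level document.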

Thanks,
Nirit and Milkana


(system) #2

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.