Hi all,
We are running Elasticsearch 6.x with a mapping that contains nested fields of varying value counts and sizes. Some of the fields contain up to 1M short string values, and users should be able to filter and run aggregations on them in reasonable time (milliseconds).
We want to optimize queries against these fields by splitting them into smaller chunks, and we would like to know which of the following two options is better.
the field mapping is:
{
  "myField": {
    "type": "nested",
    "properties": {
      "id": { "type": "keyword" },
      "values": {
        "type": "text",
        "fields": {
          "keyword": { "type": "keyword" }
        }
      }
    }
  }
}
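
For context, a typical filter plus aggregation against this field looks roughly like the sketch below (the index name my_index, the aggregation names, and the term value are only placeholders):

POST my_index/_search
{
  "size": 0,
  "query": {
    "nested": {
      "path": "myField",
      "query": {
        "term": { "myField.values.keyword": "some_value" }
      }
    }
  },
  "aggs": {
    "myField_nested": {
      "nested": { "path": "myField" },
      "aggs": {
        "top_values": {
          "terms": { "field": "myField.values.keyword", "size": 10 }
        }
      }
    }
  }
}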
Option 1:
Split the field by limiting the number of "values" per document: instead of one document with 1M values, we would index 10 documents with 100K values each.
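
A rough sketch of option 1, assuming a single _doc mapping type in 6.x (document IDs and values are placeholders):

PUT my_index/_doc/entity1_chunk_0
{
  "myField": [
    { "id": "entity1", "values": ["value_1", "value_2"] }
  ]
}

PUT my_index/_doc/entity1_chunk_1
{
  "myField": [
    { "id": "entity1", "values": ["value_100001", "value_100002"] }
  ]
}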
Option 2:
Keep a single parent document and use an array of nested objects, so that each chunk is stored as its own nested document under that parent.
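
A rough sketch of option 2, where one parent document holds the chunks as an array of nested objects (again, IDs and values are placeholders):

PUT my_index/_doc/entity1
{
  "myField": [
    { "id": "entity1_chunk_0", "values": ["value_1", "value_2"] },
    { "id": "entity1_chunk_1", "values": ["value_100001", "value_100002"] }
  ]
}

Since each nested object is indexed as a separate hidden Lucene document, both options produce a similar number of Lucene documents; the main difference is whether the chunks live under one parent or as independent top-level documents.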
Thanks,
Nirit and Milkana