Hello! I have a problem when I run a query. The mapping is very simple:
{
  "index": {
    "aliases": {},
    "mappings": {
      "level1": {
        "properties": {
          "id": {
            "type": "string"
          },
          "level2": {
            "type": "nested",
            "properties": {
              "level3": {
                "type": "nested",
                "properties": {
                  "value1": {
                    "type": "string"
                  },
                  "value2": {
                    "type": "long"
                  },
                  "id": {
                    "type": "string"
                  },
                  "value3": {
                    "type": "long"
                  }
                }
              },
              "id": {
                "type": "string"
              }
            }
          }
        }
      }
    },
    "settings": {
      "index": {
        "creation_date": "1505476515647",
        "number_of_shards": "5",
        "number_of_replicas": "1",
        "uuid": "_0IiQCPrQ1i-kDP1481y8w",
        "version": {
          "created": "2030099"
        }
      }
    },
    "warmers": {}
  }
}
And the query is:
{"query": {"terms": {"_id": ["value51"]}}}
When I run the query in Python (roughly as in the sketch above), I receive data with this structure:
_source (dict)
  level1 (list)
    level2 (list)
      data1 (dict)
        id
        value1
        value2
        value3
      data2 (dict)
      data3 (dict)
      ...
      data65000 (dict)
The problem is that 65,000 of these objects are far too many: the machine runs out of memory. I would like to know whether _search, or Elasticsearch in general, has some way of returning that information (data1, data2, data3, ...) in batches, or whether there is some way to write the query so that I do not run out of memory. Any ideas?
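For example, something like the following is the kind of paging I am hoping for (an untested sketch only; I am assuming inner_hits pagination can be applied to the doubly-nested path level2.level3, and page/page_size are hypothetical parameters):

# Untested idea: have Elasticsearch return only a slice of the nested
# objects via inner_hits, instead of the whole _source at once.
page = 0
page_size = 1000

body = {
    "_source": False,  # skip the huge top-level _source entirely
    "query": {
        "bool": {
            "filter": {"terms": {"_id": ["value51"]}},
            "must": {
                "nested": {
                    # full dotted path of the inner nested objects, per the mapping above
                    "path": "level2.level3",
                    "query": {"match_all": {}},
                    "inner_hits": {
                        "from": page * page_size,  # offset into the nested objects
                        "size": page_size          # how many nested objects per batch
                    },
                }
            },
        }
    },
}

response = es.search(index="index", body=body)
# One batch of level3 objects instead of all 65,000 at once
batch = response["hits"]["hits"][0]["inner_hits"]["level2.level3"]["hits"]["hits"]

I don't know whether this actually works on Elasticsearch 2.3 (the version in the settings above), or whether there is a better approach entirely.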
Thank you.