Hello,
I have three indices; one of them has 11 million documents, and I need to run 35,000 searches against it. Is there a way to do that without increasing my JVM heap to the point where Elasticsearch fails? I'm using the Scroll API, with queries of size 5000.
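For context, here is roughly the scroll loop I'm running (a minimal sketch; the index name and page size match my setup, but `client` is just whatever exposes the `search`/`scroll`/`clear_scroll` calls of the official Python `elasticsearch` client):

```python
def scroll_all(client, index, page_size=5000, keep_alive="2m"):
    """Yield every hit from `index`, one scroll page at a time.

    `client` is assumed to be an elasticsearch.Elasticsearch instance
    (or anything with compatible search/scroll/clear_scroll methods).
    """
    # Open the scroll context with the first page of results.
    resp = client.search(
        index=index,
        scroll=keep_alive,
        size=page_size,
        body={"query": {"match_all": {}}},  # client 8.x uses query= instead
    )
    scroll_id = resp["_scroll_id"]
    hits = resp["hits"]["hits"]
    while hits:
        for hit in hits:
            yield hit
        # Fetch the next page; the scroll id can change between pages.
        resp = client.scroll(scroll_id=scroll_id, scroll=keep_alive)
        scroll_id = resp["_scroll_id"]
        hits = resp["hits"]["hits"]
    # Release the server-side scroll context when done.
    client.clear_scroll(scroll_id=scroll_id)
```

I call this once per search, so with 35,000 searches there are a lot of concurrent scroll contexts and 5000-document pages in flight.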
I also get this error:
org.elasticsearch.client.ResponseException: method [POST], host , URI [/ensinosuperior/_search?typed_keys=true&ignore_unavailable=false&expand_wildcards=open&allow_no_indices=true&search_type=query_then_fetch&batched_reduce_size=512], status line [HTTP/1.1 429 Too Many Requests]
{"error":{"root_cause":[{"type":"circuit_breaking_exception","reason":"[parent] Data too large, data for [<http_request>] would be [8127961922/7.5gb], which is larger than the limit of [8127315968/7.5gb], real usage: [8127961440/7.5gb], new bytes reserved: [482/482b]","bytes_wanted":8127961922,"bytes_limit":8127315968,"durability":"PERMANENT"}],"type":"circuit_breaking_exception","reason":"[parent] Data too large, data for [<http_request>] would be [8127961922/7.5gb], which is larger than the limit of [8127315968/7.5gb], real usage: [8127961440/7.5gb], new bytes reserved: [482/482b]","bytes_wanted":8127961922,"bytes_limit":8127315968,"durability":"PERMANENT"},"status":429}