I have written Python code to fetch data from Elasticsearch using the 'mget' function, but once my index grew to around 360k docs I started getting Elasticsearch timeout errors frequently, even after raising request_timeout to 100, 200, and so on.
Here is the snippet I am using to fetch the documents from Elasticsearch:
ID = ['id1', 'id2', 'id3', 'id4', ..., 'id2500']
es.mget(index=indexname, doc_type=doctype, body={'ids': ID}, request_timeout=100)
As a workaround I did the following:
I was trying to fetch 25K records through a single mget call, which is what times out, so I reduced the amount of data fetched per call, for example to 5K records. That works fine, but it hurts my script's performance, as sketched below.
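Roughly, the batching workaround looks like this (the helper name, chunk size, and client setup are just illustrative, not my exact script):

from elasticsearch import Elasticsearch

es = Elasticsearch()  # assuming a default local cluster; adjust hosts as needed

def mget_in_chunks(es, indexname, doctype, ids, chunk_size=5000):
    # Fetch documents in smaller mget batches so each request stays under the timeout.
    docs = []
    for start in range(0, len(ids), chunk_size):
        chunk = ids[start:start + chunk_size]
        resp = es.mget(index=indexname, doc_type=doctype,
                       body={'ids': chunk}, request_timeout=100)
        docs.extend(resp['docs'])
    return docs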
Can anyone please help me resolve this error?