Can I set the scroll time to 300m or 500m? I got an error and, based on my research, the (likely) reason was that my scroll context expired.
TIA
Here is part of the error:
File "/usr/lib/python2.7/site-packages/elasticsearch/connection/base.py", line 125, in _raise_error
raise HTTP_EXCEPTIONS.get(status_code, TransportError)(status_code, error_message, additional_info)
elasticsearch.exceptions.NotFoundError: TransportError(404, u'search_phase_execution_exception', u'No search context found for id [7673]')
That might be the problem.
But it should not take hundreds of minutes to extract a few documents.
What exactly are you running?
I'm transferring data from elasticsearch to postgresql using python script.
Hmm. I'm transferring hundreds of thousands of documents.
I meant: what exact command are you running?
page = es.search(
    index="_all",
    scroll='10m',
    size=1000,
    body={
        "query": {
            "bool": {
                "must": [
                    {"match": {"type": log}}
                ],
                "filter": [
                    {"range": {"@timestamp": {"gte": str(filter_time)}}}
                ],
                "must_not": [
                    {"exists": {"field": "postgres_flag"}}
                ]
            }
        }
    })
This is my query for Elasticsearch.
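For what it's worth, `es.search(..., scroll='10m')` only returns the first page; each subsequent `es.scroll` call renews the context, so the timeout just has to cover processing one batch, not the whole transfer. A minimal sketch of such a loop, assuming the elasticsearch-py client (`es` is an `Elasticsearch` instance and `page` is the dict returned by the `es.search(...)` call above; the per-document handling is up to you):

```python
def scroll_all(es, page, scroll='10m'):
    """Yield every hit, fetching follow-up batches via the scroll API.

    `es` is assumed to be an elasticsearch-py client (or anything with a
    compatible .scroll method); `page` is the initial es.search() response.
    Each es.scroll call renews the scroll context for another `scroll` period.
    """
    while page['hits']['hits']:
        for doc in page['hits']['hits']:
            yield doc
        # Fetch the next batch; this also resets the keep-alive timer.
        page = es.scroll(scroll_id=page['_scroll_id'], scroll=scroll)
```

If a single batch ever takes longer than the keep-alive to process (e.g. slow PostgreSQL inserts), you get exactly the `No search context found` error above; raising the `scroll` value or shrinking `size` are the usual fixes.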
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.