BoffinPanda
(Juggernaut Panda)
April 2, 2018, 12:24am
1
I have a large data-gathering system that produces more than 10k records.
Elasticsearch only lets a search return 10,000 results.
How do I tackle this problem? Should I split the data across indices so each holds at most 10k records, or do something else?
Here is the request I use to get the record count:
http://localhost:9200/netflow-2018.02.20/_search?pretty=true&size=0&q= :
What should I do to streamline the crawl request?
Should I substitute &scroll=10m with something else?
zqc0512
(andy_zhou)
April 2, 2018, 1:29am
2
You can configure the maximum docs a search can touch:

"action" : {
  "search" : {
    "shard_count" : {
      "limit" : "20000"
    }
  }
},
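For the 10k ceiling the asker describes, the usual culprit is the index setting `index.max_result_window` (this is an assumption about what is being hit; the snippet above configures a different limit). It defaults to 10000 and can be raised per index, at the cost of more memory per deep search:

PUT /netflow-2018.02.20/_settings
{
  "index" : {
    "max_result_window" : 20000
  }
}

Raising it is generally a stopgap; scroll or search_after (see below in the thread) scale better.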
dadoonet
(David Pilato)
April 2, 2018, 3:29am
3
It depends on what you are after:
- To export data in a consistent way, use the scroll API.
- To allow deep pagination, use the search_after feature.
If it's something else, could you describe the user story?
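The two options above can be sketched as request bodies. A minimal sketch in Python, assuming the index name from the question and a hypothetical `timestamp` field plus a tiebreaker sort (both are illustrative assumptions, not from the thread):

```python
# 1) Scroll API: consistent export of every record.
#    You would POST this body to /netflow-2018.02.20/_search?scroll=10m,
#    then repeatedly POST the returned _scroll_id to /_search/scroll.
scroll_body = {
    "size": 1000,                     # page size per scroll round-trip
    "query": {"match_all": {}},
}

# 2) search_after: deep pagination past the 10k result window.
#    Requires a deterministic sort; the sort values of the last hit of
#    one page are passed as "search_after" in the request for the next.
def next_page_body(last_sort_values, page_size=100):
    """Build the search body for the page following `last_sort_values`.

    `last_sort_values` is the `sort` array of the previous page's last
    hit, or None for the first page.
    """
    body = {
        "size": page_size,
        # Assumed sort: a timestamp plus a unique tiebreaker field.
        "sort": [{"timestamp": "asc"}, {"_id": "asc"}],
        "query": {"match_all": {}},
    }
    if last_sort_values is not None:
        body["search_after"] = last_sort_values
    return body

first_page = next_page_body(None)
second_page = next_page_body(["2018-02-20T00:00:01", "doc-1042"])
```

Scroll keeps a consistent snapshot open (here 10 minutes per round-trip), so it suits one-off exports; search_after is stateless and suits user-facing pagination.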
system
(system)
Closed
April 30, 2018, 3:29am
4
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.