OK, thanks. Although we do have the multi-MB fields, they are few and
far between, and we're planning on at least 16 if not 32GB of memory
per server (x64) and greatly enlarging the Java memory limit, so
hopefully memory won't be an issue.
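For what it's worth, a hedged sketch of raising the JVM heap before launching elasticsearch. The `ES_MIN_MEM`/`ES_MAX_MEM` variable names match the 0.x-era `bin/elasticsearch` startup script and are an assumption here; verify against your install:

```shell
# Give the elasticsearch JVM a larger heap (illustrative 16g values;
# variable names assumed from the 0.x startup script)
export ES_MIN_MEM=16g
export ES_MAX_MEM=16g
```

Setting min and max to the same value avoids heap resizing pauses at runtime.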
On Mar 10, 12:07 am, Shay Banon shay.ba...@elasticsearch.com wrote:
There isn't. But note: with fields / documents this large, they have to be represented in memory on the elasticsearch server. Make sure you have plenty of it.
On Wednesday, March 9, 2011 at 11:39 PM, Matt Paul wrote:
Thanks Shay! Is there any limit on the length of a field (text or otherwise)?
On Mar 9, 12:30 pm, Shay Banon shay.ba...@elasticsearch.com wrote:
There is a built-in limit in the HTTP (chunk handling) layer that caps requests at 100mb. You can change it using http.max_content_length (for example, set it to a bigger value).
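For reference, a hedged sketch of what that setting might look like in elasticsearch.yml (the 200mb value is an arbitrary illustration, not a recommendation from this thread):

```yaml
# elasticsearch.yml -- raise the HTTP request size ceiling
# (200mb is an illustrative value; pick one that fits your largest bulk requests)
http.max_content_length: 200mb
```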
On Wednesday, March 9, 2011 at 4:47 PM, Matt Paul wrote:
I can't seem to find this in the docs, so if it exists, I apologize. Is there a default size limit for a text field? Also, is there a size limit for an HTTP request body in the REST API? I seem to be having issues sending requests where some part is too large, and I'm not sure if the problem is a single large field (16MB works, 50MB doesn't) or the entire bulk insert request (again, 50MB works, 100MB doesn't).
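Besides raising the server-side limit, another way around the request-size ceiling described above is to split the bulk payload client-side. A minimal Python sketch, under the assumption that the bulk body is held as (action, source) line pairs; the byte budget and data shape are illustrative, not from the thread:

```python
def chunk_bulk_lines(pairs, max_bytes):
    """Group bulk-API (action, source) line pairs into chunks whose
    combined byte length stays under max_bytes, yielding one chunk
    per HTTP request. max_bytes should sit safely below the server's
    http.max_content_length."""
    chunk, size = [], 0
    for action, source in pairs:
        pair_size = len(action) + len(source) + 2  # +2 for the newlines
        if chunk and size + pair_size > max_bytes:
            yield chunk
            chunk, size = [], 0
        chunk.append((action, source))
        size += pair_size
    if chunk:
        yield chunk
```

Each yielded chunk can then be joined with newlines and POSTed to the bulk endpoint as its own request, keeping every request under the limit without touching server configuration.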