The JVM will continue to use memory up to the maximum you set (max mem, i.e. -Xmx).
Once the JVM has claimed that memory, it will not release it back to the OS, but
internally, memory will be freed once the garbage collector runs.
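A minimal sketch of that distinction (not from the original thread): the heap the JVM has committed from the OS (`totalMemory()`) tends to stay high even after a GC, while the memory actually in use (`totalMemory() - freeMemory()`) goes back down.

```java
// Sketch: observing the difference between committed heap and used heap.
public class HeapStats {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long max = rt.maxMemory();      // the -Xmx ceiling
        long total = rt.totalMemory();  // committed heap; rarely returned to the OS
        long free = rt.freeMemory();    // unused space inside the committed heap
        long used = total - free;       // what the application actually holds
        System.out.println("max=" + max + " total=" + total + " used=" + used);
        // Invariants: used heap fits in the committed heap, which fits under -Xmx
        assert used <= total;
        assert total <= max;
    }
}
```

A process monitor (like the one watched in this thread) only sees something close to `totalMemory()`, which is why Java can appear to "never release" memory even when the GC is reclaiming it internally.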
When you run it one by one, do you still get OutOfMemory errors?
On Sun, Nov 28, 2010 at 3:38 PM, Martijn Laarman firstname.lastname@example.org wrote:
That's what I thought at first too, but I am putting a breakpoint afterwards,
and even if I wait for the responses to come back in, I can see
Java's memory growing gradually.
Even if I use Fiddler to manually fire the requests one by one, I see
Java jumping 4 to 10 MB after each request but never releasing it.
I am using 13.0 with out-of-the-box settings.
Calling _flush has no effect either.
On Sat, Nov 27, 2010 at 10:04 PM, Shay Banon <email@example.com
I see that you call IndexAsync; is there a chance that you are simply
creating too many concurrent bulk indexing requests against the server? Since
you don't wait for a bulk indexing request to return, there might
eventually be hundreds of concurrent bulk indexing requests in flight on the
server, eventually causing it to max out on memory. If you cap it at 10-15
concurrent indexing actions, does it still happen?
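One way to apply that cap (a sketch, not the poster's actual .NET client): bound the number of in-flight async bulk requests with a semaphore, so the producer blocks once the limit is reached instead of piling requests onto the server. The names and the sleep standing in for the HTTP call are made up for illustration.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch: throttle fire-and-forget bulk requests to at most N in flight.
public class ThrottledIndexer {
    public static void main(String[] args) throws Exception {
        final int maxInFlight = 10;                 // the 10-15 range suggested above
        final Semaphore permits = new Semaphore(maxInFlight);
        final AtomicInteger inFlight = new AtomicInteger();
        final AtomicInteger peak = new AtomicInteger();
        ExecutorService pool = Executors.newFixedThreadPool(50);

        for (int i = 0; i < 200; i++) {
            permits.acquire();                      // block until a slot frees up
            pool.submit(() -> {
                try {
                    int now = inFlight.incrementAndGet();
                    peak.accumulateAndGet(now, Math::max);
                    Thread.sleep(5);                // stand-in for the HTTP bulk call
                    inFlight.decrementAndGet();
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                } finally {
                    permits.release();              // free the slot when the response is back
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(30, TimeUnit.SECONDS);
        System.out.println("peak concurrent requests: " + peak.get());
        assert peak.get() <= maxInFlight;           // never more than N in flight
    }
}
```

The same idea works in any client: acquire before sending, release in the response callback.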
On Sat, Nov 27, 2010 at 9:49 PM, Martijn Laarman firstname.lastname@example.org:
I'm creating a rather naive implementation of the bulk API (naive in
the sense that it's not zero-copy, although I plan to support that later).
I'm using the Hacker News database dump to insert data en masse into ES.
This fires a lot of HTTP bulk insert calls to ES:
Example call and response:
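The original example call and response aren't shown above; for context, a sketch of the general shape of an ES `_bulk` request body (newline-delimited JSON: an action line followed by a source line, ending with a newline). The index, type, and field names here are made up for illustration.

```java
// Sketch: building a minimal _bulk request body (NDJSON).
public class BulkBody {
    public static void main(String[] args) {
        StringBuilder body = new StringBuilder();
        // Action line: what to do and where
        body.append("{\"index\":{\"_index\":\"hackernews\",\"_type\":\"post\",\"_id\":\"1\"}}\n");
        // Source line: the document itself
        body.append("{\"title\":\"example post\"}\n");
        // This body would be POSTed to http://localhost:9200/_bulk
        System.out.print(body);
        assert body.toString().endsWith("\n"); // bulk bodies must end with a newline
    }
}
```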
If I step through my code after each
'client.IndexAsync(postQueue);' call, I see Java's memory growing
and never recovering, while my indexer releases
the postQueue from its own memory. Java jumps from 200 MB to 1.5 GB, at which
point it starts to spit out Java heap space errors since it can no longer
allocate any more memory. The indexer stays at around 20 MB throughout.
Any idea what might be going on?