GC running early?

I'm setting up a new cluster (new hardware) and seeing unexpected heap behavior while doing some load testing. Elasticsearch 5.2.2, OpenJDK 8, SL7 (i.e. RHEL7). All nodes are configured with a 16gb heap, verified in the logs and via the API, yet GC appears to be kicking in at around 1.5gb of heap usage. This is under constant indexing, a mix of log-like and document-like (updating) workloads. One node stands out, but I don't see anything else unusual about it (it's not the master, and all nodes are configured identically with Puppet). Any suggestions?
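
In case it's useful, per-pool heap usage is visible through the nodes stats API; I've been eyeballing it with something like this (assuming the cluster is reachable on localhost:9200 and jq is available on the box):

curl -s 'localhost:9200/_nodes/stats/jvm' | jq '.nodes[] | {name, pools: .jvm.mem.pools}'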

Thanks,
Kevin

Complete JVM options:

-Dfile.encoding=UTF-8
-Dio.netty.noKeySetOptimization=true
-Dio.netty.noUnsafe=true
-Dio.netty.recycler.maxCapacityPerThread=0
-Djava.awt.headless=true
-Djdk.io.permissionsUseCanonicalPath=true
-Djna.nosys=true
-Dlog4j.shutdownHookEnabled=false
-Dlog4j.skipJansi=true
-Dlog4j2.disable.jmx=true
-XX:+AlwaysPreTouch
-XX:+DisableExplicitGC
-XX:+HeapDumpOnOutOfMemoryError
-XX:+UseCMSInitiatingOccupancyOnly
-XX:+UseConcMarkSweepGC
-XX:CMSInitiatingOccupancyFraction=75
-Xms16g
-Xmx16g
-Xss1m
-server
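
To rule out the service unit overriding any of these, the flags on the live process can be double-checked with the JDK's jps (run as the user Elasticsearch runs as), e.g.:

jps -lv | grep -i elasticsearch

which should echo the same -Xms16g/-Xmx16g.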

Based on the graph it looks like your nodes are configured with 2GB heap, not 16GB. How did you install Elasticsearch? How are you starting it?
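
If these are package installs, the heap is normally set in the jvm.options file, so it's worth confirming that, plus anything the systemd unit or /etc/sysconfig/elasticsearch might be injecting via ES_JAVA_OPTS. For example (assuming the default RPM paths):

grep -E '^-Xm[sx]' /etc/elasticsearch/jvm.options
systemctl cat elasticsearch | grep -i -E 'Xm[sx]|ES_JAVA_OPTS'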


Right, hence my confusion. The API clearly shows 16gb though [1]. The nodes were installed and configured with the elastic/elasticsearch Puppet Forge module and run under systemd. Now that I've loaded more data, heap usage has gone up on the busier nodes [2], so maybe it was just a matter of getting more activity. This is a much larger cluster than the one we were using before.

[1]
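This is output from the cat nodes API, via something like:

curl -s 'localhost:9200/_cat/nodes?v&h=name,heap.current,heap.max'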

name       heap.current heap.max
esworker08        1.6gb   15.8gb
esworker26        1.5gb   15.8gb
esworker05        1.5gb   15.8gb
esworker31        2.4gb   15.8gb
esworker01        1.4gb   15.8gb
esworker19          1gb   15.8gb
esworker32        1.1gb   15.8gb
esworker13        2.2gb   15.8gb
esclient01          7gb   15.9gb
esworker15        1.1gb   15.8gb
esclient02          1gb   15.9gb
esworker24        2.2gb   15.8gb
esworker16        1.5gb   15.8gb
esworker06        1.1gb   15.8gb
esworker29        1.9gb   15.8gb
esworker09        1.9gb   15.8gb
esworker11        960mb   15.8gb
esworker20      856.4mb   15.8gb
esworker33        9.2gb   15.8gb
esworker28        1.2gb   15.8gb
esworker07        1.9gb   15.8gb
esworker03          1gb   15.8gb
esworker25        1.1gb   15.8gb
esworker14      832.5mb   15.8gb
esworker02        1.4gb   15.8gb

[2] (heap usage graph)
