Hello,
I have the following config:
2 client nodes
3 master nodes
3 data+ingest nodes
My cluster works with only 1 data node; the other 2 of them get the following error:
[2018-02-03T18:22:54,183][WARN ][o.e.m.j.JvmGcMonitorService] [es-data-02] [gc][1984] overhead, spent [1.4m] collecting in the last [1.4m]
[2018-02-03T18:33:50,485][INFO ][o.e.m.j.JvmGcMonitorService] [es-data-02] [gc][old][1985][422] duration [10.9m], collections [76]/[10.9m], total [10.9m]/[52.4m], memory [3.8gb]->[3.8gb]/[3.8gb], all_pools {[young] [865.3mb]->[865.3mb]/[865.3mb]}{[survivor] [108mb]->[107.8mb]/[108.1mb]}{[old] [2.9gb]->[2.9gb]/[2.9gb]}
[2018-02-03T18:33:50,486][WARN ][o.e.m.j.JvmGcMonitorService] [es-data-02] [gc][1985] overhead, spent [10.9m] collecting in the last [10.9m]
[2018-02-03T18:34:38,367][ERROR][o.e.t.n.Netty4Utils ] fatal error on the network layer
at org.elasticsearch.transport.netty4.Netty4Utils.maybeDie(Netty4Utils.java:185)
at org.elasticsearch.transport.netty4.Netty4MessageChannelHandler.exceptionCaught(Netty4MessageChannelHandler.java:83)
at io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:285)
at io.netty.channel.AbstractChannelHandlerContext.invokeExceptionCaught(AbstractChannelHandlerContext.java:264)
at io.netty.channel.AbstractChannelHandlerContext.fireExceptionCaught(AbstractChannelHandlerContext.java:256)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.handleReadException(AbstractNioByteChannel.java:104)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:145)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:644)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:544)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:498)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:458)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at java.lang.Thread.run(Thread.java:745)
[2018-02-03T18:42:31,063][ERROR][o.e.b.ElasticsearchUncaughtExceptionHandler] [es-data-02] fatal error in thread [elasticsearch[es-data-02][search][T#14]], exiting
java.lang.OutOfMemoryError: Java heap space
[2018-02-03T18:42:31,061][ERROR][o.e.b.ElasticsearchUncaughtExceptionHandler] [es-data-02] fatal error in thread [elasticsearch[es-data-02][generic][T#6]], exiting
java.lang.OutOfMemoryError: Java heap space
[2018-02-03T18:33:59,924][ERROR][o.e.b.ElasticsearchUncaughtExceptionHandler] [es-data-02] fatal error in thread [elasticsearch[es-data-02][management][T#3]], exiting
java.lang.OutOfMemoryError: Java heap space
at java.lang.StringCoding$StringEncoder.encode(StringCoding.java:300) ~[?:1.8.0_111]
at java.lang.StringCoding.encode(StringCoding.java:344) ~[?:1.8.0_111]
at java.lang.String.getBytes(String.java:918) ~[?:1.8.0_111]
at java.io.UnixFileSystem.canonicalize0(Native Method) ~[?:1.8.0_111]
at java.io.UnixFileSystem.canonicalize(UnixFileSystem.java:172) ~[?:1.8.0_111]
at java.io.File.getCanonicalPath(File.java:618) ~[?:1.8.0_111]
at java.io.FilePermission$1.run(FilePermission.java:215) ~[?:1.8.0_111]
at java.io.FilePermission$1.run(FilePermission.java:203) ~[?:1.8.0_111]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_111]
at java.io.FilePermission.init(FilePermission.java:203) ~[?:1.8.0_111]
at java.io.FilePermission.<init>(FilePermission.java:277) ~[?:1.8.0_111]
at java.lang.SecurityManager.checkRead(SecurityManager.java:888) ~[?:1.8.0_111]
at sun.nio.fs.UnixPath.checkRead(UnixPath.java:795) ~[?:?]
at sun.nio.fs.UnixFileAttributeViews$Basic.readAttributes(UnixFileAttributeViews.java:49) ~[?:?]
at sun.nio.fs.UnixFileSystemProvider.readAttributes(UnixFileSystemProvider.java:144) ~[?:?]
at sun.nio.fs.LinuxFileSystemProvider.readAttributes(LinuxFileSystemProvider.java:99) ~[?:?]
at java.nio.file.Files.readAttributes(Files.java:1737) ~[?:1.8.0_111]
at java.nio.file.Files.size(Files.java:2332) ~[?:1.8.0_111]
at org.apache.lucene.store.FSDirectory.fileLength(FSDirectory.java:243) ~[lucene-core-6.6.1.jar:6.6.1 9aa465a89b64ff2dabe7b4d50c472de32c298683 - varunthacker - 2017-08-29 21:54:39]
at org.apache.lucene.store.FilterDirectory.fileLength(FilterDirectory.java:67) ~[lucene-core-6.6.1.jar:6.6.1 9aa465a89b64ff2dabe7b4d50c472de32c298683 - varunthacker - 2017-08-29 21:54:39]
at org.apache.lucene.store.FilterDirectory.fileLength(FilterDirectory.java:67) ~[lucene-core-6.6.1.jar:6.6.1 9aa465a89b64ff2dabe7b4d50c472de32c298683 - varunthacker - 2017-08-29 21:54:39]
at org.elasticsearch.index.store.Store$StoreStatsCache.estimateSize(Store.java:1402) ~[elasticsearch-5.6.3.jar:5.6.3]
at org.elasticsearch.index.store.Store$StoreStatsCache.refresh(Store.java:1391) ~[elasticsearch-5.6.3.jar:5.6.3]
at org.elasticsearch.index.store.Store$StoreStatsCache.refresh(Store.java:1378) ~[elasticsearch-5.6.3.jar:5.6.3]
at org.elasticsearch.common.util.SingleObjectCache.getOrRefresh(SingleObjectCache.java:54) ~[elasticsearch-5.6.3.jar:5.6.3]
at org.elasticsearch.index.store.Store.stats(Store.java:332) ~[elasticsearch-5.6.3.jar:5.6.3]
at org.elasticsearch.index.shard.IndexShard.storeStats(IndexShard.java:703) ~[elasticsearch-5.6.3.jar:5.6.3]
at org.elasticsearch.action.admin.indices.stats.CommonStats.<init>(CommonStats.java:177) ~[elasticsearch-5.6.3.jar:5.6.3]
at org.elasticsearch.action.admin.indices.stats.TransportIndicesStatsAction.shardOperation(TransportIndicesStatsAction.java:163) ~[elasticsearch-5.6.3.jar:5.6.3]
at org.elasticsearch.action.admin.indices.stats.TransportIndicesStatsAction.shardOperation(TransportIndicesStatsAction.java:47) ~[elasticsearch-5.6.3.jar:5.6.3]
at org.elasticsearch.action.support.broadcast.node.TransportBroadcastByNodeAction$BroadcastByNodeTransportRequestHandler.onShardOperation(TransportBroadcastByNodeAction.java:433) ~[elasticsearch-5.6.3.jar:5.6.3]
at org.elasticsearch.action.support.broadcast.node.TransportBroadcastByNodeAction$BroadcastByNodeTransportRequestHandler.messageReceived(TransportBroadcastByNodeAction.java:412) ~[elasticsearch-5.6.3.jar:5.6.3]
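From the GC log above, the heap on es-data-02 is completely full (`memory [3.8gb]->[3.8gb]/[3.8gb]`), so the node spends all its time in old-gen GC before dying with `OutOfMemoryError`. For reference, this is roughly how the heap is set on these data nodes in `config/jvm.options` (a sketch inferred from the `[3.8gb]` total in the log, not my exact file):

```
## config/jvm.options (data nodes)
## the [3.8gb] heap total in the GC log suggests a ~4 GB heap;
## -Xms and -Xmx should be set to the same value
-Xms4g
-Xmx4g
```

The physical machines have more RAM than this, if raising the heap is the right fix.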
Would you please help me solve this issue?
Best regards