kyle_che (Kyle Stephenson) February 6, 2018, 8:01pm #1
I recently upgraded to Logstash 6.1.3 and I am now getting Java heap space out-of-memory errors. The machine has 32 GB of memory and I gave the JVM 16 GB. I did not have this error on the previous version, 6.0.1. Is this a bug, or am I missing something here?
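These are the relevant heap settings from config/jvm.options: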
-Xms16g
-Xmx16g
[2018-02-06T19:28:24,262][WARN ][io.netty.channel.nio.NioEventLoop] Unexpected exception in the selector loop.
java.lang.OutOfMemoryError: Java heap space
kyle_che (Kyle Stephenson) February 7, 2018, 4:00pm #2
I have updated to the latest, 6.2.0, and I am still running out of Java heap space. Is there a limit to how many nodes can connect to Logstash?
[2018-02-07T15:55:21,960][ERROR][logstash.pipeline ] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {:pipeline_id=>"main", "exception"=>"Java heap space", "backtrace"=>["org.joni.StackMachine.ensure1(StackMachine.java:103)", "org.joni.StackMachine.push(StackMachine.java:167)", "org.joni.StackMachine.pushAlt(StackMachine.java:205)", "org.joni.ByteCodeMachine.opAnyCharMLStar(ByteCodeMachine.java:901)", "org.joni.ByteCodeMachine.matchAt(ByteCodeMachine.java:231)", "org.joni.Matcher.matchCheck(Matcher.java:304)", "org.joni.Matcher.searchInterruptible(Matcher.java:457)", "org.jruby.RubyRegexp$SearchMatchTask.run(RubyRegexp.java:268)", "org.jruby.RubyRegexp$SearchMatchTask.run(RubyRegexp.java:249)", "org.jruby.RubyThread.executeTask(RubyThread.java:1485)", "org.jruby.RubyRegexp.matcherSearch(RubyRegexp.java:232)", "org.jruby.RubyRegexp.search19(RubyRegexp.java:1222)", "org.jruby.RubyRegexp.search19(RubyRegexp.java:1177)", "org.jruby.RubyRegexp.matchPos(RubyRegexp.java:1153)", "org.jruby.RubyRegexp.match19Common(RubyRegexp.java:1121)", "org.jruby.RubyRegexp.match_m19(RubyRegexp.java:1107)", "org.jruby.RubyRegexp$INVOKER$i$match_m19.call(RubyRegexp$INVOKER$i$match_m19.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodOneOrNBlock.call(JavaMethod.java:384)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:161)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_3_dot_0.gems.jls_minus_grok_minus_0_dot_11_dot_4.lib.grok_minus_pure.invokeOther1:match(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.4/lib/grok-pure.rb:182)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_3_dot_0.gems.jls_minus_grok_minus_0_dot_11_dot_4.lib.grok_minus_pure.RUBY$method$execute$0(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.4/lib/grok-pure.rb:182)", "java.lang.invoke.LambdaForm$DMH/1577093677.invokeStatic_L7_L(LambdaForm$DMH)", "java.lang.invoke.LambdaForm$MH/693083472.invokeExact_MT(LambdaForm$MH)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:103)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:163)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:200)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:161)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_3_dot_0.gems.logstash_minus_filter_minus_grok_minus_4_dot_0_dot_2.lib.logstash.filters.grok.timeout_enforcer.invokeOther10:execute(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.2/lib/logstash/filters/grok/timeout_enforcer.rb:20)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_3_dot_0.gems.logstash_minus_filter_minus_grok_minus_4_dot_0_dot_2.lib.logstash.filters.grok.timeout_enforcer.RUBY$method$grok_till_timeout$0(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.2/lib/logstash/filters/grok/timeout_enforcer.rb:20)", "java.lang.invoke.LambdaForm$DMH/53762591.invokeStatic_L9_L(LambdaForm$DMH)", "java.lang.invoke.LambdaForm$MH/1083292099.invokeExact_MT(LambdaForm$MH)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:129)"], :thread=>"#<Thread:0x58ea5a89@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 sleep>"}
[2018-02-07T15:55:41,049][ERROR][org.logstash.Logstash ] java.lang.OutOfMemoryError: Java heap space
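For what it's worth, the backtrace points into the grok filter's regex engine (org.joni), so the error is hitting mid-match. In case this turns out to be a pattern problem rather than connection count, one thing I'm looking at is bounding match time with grok's timeout_millis option. A minimal sketch (the pattern here is made up, not my real filter):

filter {
  grok {
    # abort any single match attempt after 10 seconds
    timeout_millis => 10000
    match => { "message" => "^%{TIMESTAMP_ISO8601:ts} %{GREEDYDATA:msg}$" }
  }
}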
kyle_che (Kyle Stephenson) February 7, 2018, 4:03pm #3
I'm showing 238 established connections on port 5044.
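For reference, I got that count with something like:

# count established connections to the beats input
netstat -ant | grep ':5044' | grep -c ESTABLISHED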
kyle_che (Kyle Stephenson) February 8, 2018, 5:05pm #4
I was able to get around this by offloading the clients to multiple Logstash instances and by adding more CPUs to handle the high ingest load. The errors have now gone away.
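For anyone doing the same: the clients here connect to the beats input on 5044, so spreading them across instances is roughly this in each Beat's output config (the hostnames are placeholders for my real ones):

output.logstash:
  hosts: ["logstash-01:5044", "logstash-02:5044"]
  loadbalance: true   # distribute events across both instances

On the Logstash side, pipeline.workers in logstash.yml defaults to the number of CPU cores, so the extra CPUs get picked up automatically.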
system (system) Closed March 8, 2018, 5:05pm #5
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.