Logstash stopping randomly after a few hours

Hi,
I'm getting a random error when I run logstash:

16:30:26.240 [[main]>worker0] ERROR logstash.pipeline - Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.
{"exception"=>#<InterruptedRegexpError: Regexp Interrupted>, "backtrace"=>[
  "org/jruby/RubyString.java:3101:in `gsub'",
  "org/jruby/RubyString.java:3069:in `gsub'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.3/lib/logstash/filters/mutate.rb:317:in `gsub_dynamic_fields'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.3/lib/logstash/filters/mutate.rb:308:in `gsub'",
  "org/jruby/RubyArray.java:1613:in `each'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.3/lib/logstash/filters/mutate.rb:290:in `gsub'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.3/lib/logstash/filters/mutate.rb:207:in `filter'",
  "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:145:in `do_filter'",
  "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:164:in `multi_filter'",
  "org/jruby/RubyArray.java:1613:in `each'",
  "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:161:in `multi_filter'",
  "/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb:41:in `multi_filter'",
  "(eval):4135:in `initialize'",
  "org/jruby/RubyArray.java:1613:in `each'",
  "(eval):4131:in `initialize'",
  "org/jruby/RubyProc.java:281:in `call'",
  "(eval):997:in `filter_func'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:295:in `filter_batch'",
  "org/jruby/RubyProc.java:281:in `call'",
  "/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:192:in `each'",
  "org/jruby/RubyHash.java:1342:in `each'",
  "/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:191:in `each'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:294:in `filter_batch'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:282:in `worker_loop'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:258:in `start_workers'"]}

16:30:26.542 [LogStash::Runner] FATAL logstash.runner - An unexpected error occurred!
{:error=>#<InterruptedRegexpError: Regexp Interrupted>, :backtrace=>[
  "org/jruby/RubyString.java:3101:in `gsub'",
  "org/jruby/RubyString.java:3069:in `gsub'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.3/lib/logstash/filters/mutate.rb:317:in `gsub_dynamic_fields'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.3/lib/logstash/filters/mutate.rb:308:in `gsub'",
  "org/jruby/RubyArray.java:1613:in `each'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.3/lib/logstash/filters/mutate.rb:290:in `gsub'",
  "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.3/lib/logstash/filters/mutate.rb:207:in `filter'",
  "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:145:in `do_filter'",
  "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:164:in `multi_filter'",
  "org/jruby/RubyArray.java:1613:in `each'",
  "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:161:in `multi_filter'",
  "/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb:41:in `multi_filter'",
  "(eval):4135:in `initialize'",
  "org/jruby/RubyArray.java:1613:in `each'",
  "(eval):4131:in `initialize'",
  "org/jruby/RubyProc.java:281:in `call'",
  "(eval):997:in `filter_func'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:295:in `filter_batch'",
  "org/jruby/RubyProc.java:281:in `call'",
  "/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:192:in `each'",
  "org/jruby/RubyHash.java:1342:in `each'",
  "/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:191:in `each'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:294:in `filter_batch'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:282:in `worker_loop'",
  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:258:in `start_workers'"]}

My logstash config file is:

input {
  file {
    type => "SystemError"
    path => "/app/systemerr/**/*"
    start_position => "beginning"
    codec => multiline {
      pattern => "^\s"
      what => "previous"
    }
  }

  file {
    type => "SystemOut"
    path => "/app/systemout/**/*"
    start_position => "beginning"
    codec => multiline {
      pattern => "^\["
      negate => true
      what => "previous"
    }
  }
}

filter {
  if [type] == "SystemError" {
    grok {
      match => { "message" => "\[%{DATA:timestamp}] %{BASE16NUM:threadID} (?<shortname>\b[A-Za-z0-9\$]{2,}\b)%{SPACE}%{WORD:loglevel}%{SPACE} %{GREEDYDATA:message}" }
      overwrite => [ "message" ]
    }

    mutate {
      gsub => ["timestamp", " GMT\+05\:30", ""]
    }

    date {
      match => ["timestamp", "M/dd/yy HH:mm:ss:SSS"]
    }

    if [message] =~ "^\tat" {
      drop {}
    }

    if [path] =~ "113" {
      mutate {
        add_field => { "server" => "113" }
      }
    } else {
      mutate {
        add_field => { "server" => "117" }
      }
    }
  }

  if [type] == "SystemOut" {
    grok {
      match => { "message" => "\[%{DATA:timestamp}] %{BASE16NUM:threadID} (?<shortname>\b[A-Za-z0-9\$]{2,}\b)%{SPACE}%{WORD:loglevel}%{SPACE} %{GREEDYDATA:message}" }
      overwrite => [ "message" ]
    }

    mutate {
      gsub => ["timestamp", " GMT\+05\:30", ""]
    }

    date {
      match => ["timestamp", "M/dd/yy HH:mm:ss:SSS"]
    }

    if [path] =~ "113" {
      mutate {
        add_field => { "server" => "113" }
      }
    } else {
      mutate {
        add_field => { "server" => "117" }
      }
    }
  }
}

Is there anything wrong with the config file? Please help.

What comes after {"exception"=># at the beginning of the error message? Also, please give an example of an event that gives this result.

Hi,

I'm not sure what you mean by after {"exception"=>#. I have copy-pasted the entire error as it appeared. Please elaborate if you need something different.

Also, I'm not sure which event triggers this. Logstash runs for a few hours and parses all events fine; it only fails after a few hours. However, I have given below an example of the kind of events that are generated:

[6/20/17 1:06:48:276 GMT+05:30] 000015fd SystemErr     R com.asdf.bank.NonFatalException: 
	at com.asdf.bank.eipp.worker.PaymentServicesWorker.fetchTxnDetails(Unknown Source)
	at com.asdf.bank.eipp.ejb.handlers.StatusEnquiryHandler.execute(Unknown Source)
	at com.asdf.bank.common.controller.Conductor.process(Unknown Source)
	at com.asdf.j2ee.utils.controller.RequestProcessorImpl.process(RequestProcessorImpl.java:87)
	at com.asdf.j2ee.utils.controller.ControllerImpl.postFilterProcess(ControllerImpl.java:210)
	at com.asdf.bank.common.controller.Controller.process(Unknown Source)
	at com.asdf.bank.eipp.ejb.PaymentServices.getTxnDetails(Unknown Source)
	at com.asdf.bank.eipp.ejb.RemotePaymentServicesBeanRemote(Unknown Source)

Please tell me if you require anything else.

Thanks

You posted

16:30:26.240 [[main]>worker0] ERROR logstash.pipeline - Exception in
pipelineworker, the pipeline stopped processing new events, please
check your filter configuration and restart Logstash.
{"exception"=>#,
"backtrace"=>["org/jruby/RubyString.java:3101

but I'm quite sure that if you look in your log there's something else that follows "exception"=>#. It should look similar to this:

"exception"=>#<interesting error message here>

Hi,

I could not find any message after {"exception"=>#, but there is another error statement generated. Maybe that can help.

"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:258:in
start_workers'"]} 16:30:26.542 [LogStash::Runner] FATAL
logstash.runner - An unexpected error occurred!
{:error=>#<InterruptedRegexpError: Regexp Interrupted>,
:backtrace=>["org/jruby/RubyString.java:3101:ingsub'",
"org/jruby/RubyString.java:3069:in gsub'",

Full message is:

  "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:258:in
    start_workers'"]} 16:30:26.542 [LogStash::Runner] FATAL
    logstash.runner - An unexpected error occurred!
    {:error=>#<InterruptedRegexpError: Regexp Interrupted>,
    :backtrace=>["org/jruby/RubyString.java:3101:ingsub'",
    "org/jruby/RubyString.java:3069:in gsub'",
    "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.3/lib/logstash/filters/mutate.rb:317:in
    gsub_dynamic_fields'",
    "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.3/lib/logstash/filters/mutate.rb:308:in
    gsub'", "org/jruby/RubyArray.java:1613:ineach'",
    "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.3/lib/logstash/filters/mutate.rb:290:in
    gsub'",
    "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-mutate-3.1.3/lib/logstash/filters/mutate.rb:207:in
    filter'",
    "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:145:in
    do_filter'",
    "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:164:in
    multi_filter'", "org/jruby/RubyArray.java:1613:ineach'",
    "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:161:in
    multi_filter'",
    "/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb:41:in
    multi_filter'", "(eval):4135:ininitialize'",
    "org/jruby/RubyArray.java:1613:in each'", "(eval):4131:in
    initialize'", "org/jruby/RubyProc.java:281:incall'", "(eval):997:in
    filter_func'",
    "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:295:in
    filter_batch'", "org/jruby/RubyProc.java:281:incall'",
    "/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:192:in
    each'", "org/jruby/RubyHash.java:1342:ineach'",
    "/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:191:in
    each'",
    "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:294:in
    filter_batch'",
    "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:282:in
    worker_loop'",
    "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:258:in
    start_workers'"]}

UPDATE:

I got the error again, and this time I could see what comes after {"exception"=>#:

{"exception"=>#<InterruptedRegexpError: Regexp Interrupted>

Hi,

I'm now also getting the following error, which seems to be related to the previous errors:

12:23:01.630 [[main]>worker1] ERROR logstash.pipeline - Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {"exception"=>java.lang.OutOfMemoryError: Java heap space, "backtrace"=>["org.joni.StackMachine.ensure1(StackMachine.java:98)", "org.joni.StackMachine.push(StackMachine.java:162)", "org.joni.StackMachine.pushAlt(StackMachine.java:200)", "org.joni.ByteCodeMachine.opAnyCharMLStar(ByteCodeMachine.java:901)", "org.joni.ByteCodeMachine.matchAt(ByteCodeMachine.java:231)", "org.joni.Matcher.matchCheck(Matcher.java:304)", "org.joni.Matcher.searchInterruptible(Matcher.java:457)", "org.jruby.RubyRegexp$SearchMatchTask.run(RubyRegexp.java:273)", "org.jruby.RubyThread.executeBlockingTask(RubyThread.java:1066)", "org.jruby.RubyRegexp.matcherSearch(RubyRegexp.java:235)", "org.jruby.RubyRegexp.search19(RubyRegexp.java:1780)", "org.jruby.RubyRegexp.matchPos(RubyRegexp.java:1720)", "org.jruby.RubyRegexp.match19Common(RubyRegexp.java:1701)", "org.jruby.RubyRegexp.match_m19(RubyRegexp.java:1680)", "org.jruby.RubyRegexp$INVOKER$i$match_m19.call(RubyRegexp$INVOKER$i$match_m19.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodOneOrNBlock.call(JavaMethod.java:350)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:168)", "rubyjit.Grok$$execute_53328a415db88c422bae07913b18780ed48cfbe61442407170.file(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.4/lib/grok-pure.rb:182)", "rubyjit.Grok$$execute_53328a415db88c422bae07913b18780ed48cfbe61442407170.file(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.4/lib/grok-pure.rb)", "org.jruby.internal.runtime.methods.JittedMethod.call(JittedMethod.java:181)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:168)", "rubyjit.LogStash::Filters::Grok$$match_against_groks_14ba492e5283494c850c28f08411346110f357fa1442407170.block_1$RUBY$file(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.0/lib/logstash/filters/grok.rb:334)", "rubyjit$LogStash::Filters::Grok$$match_against_groks_14ba492e5283494c850c28f08411346110f357fa1442407170$block_1$RUBY$file.call(rubyjit$LogStash::Filters::Grok$$match_against_groks_14ba492e5283494c850c28f08411346110f357fa1442407170$block_1$RUBY$file)", "org.jruby.runtime.CompiledBlock19.yieldSpecificInternal(CompiledBlock19.java:117)", "org.jruby.runtime.CompiledBlock19.yieldSpecific(CompiledBlock19.java:92)", "org.jruby.runtime.Block.yieldSpecific(Block.java:111)", "rubyjit.LogStash::Filters::Grok::TimeoutEnforcer$$grok_till_timeout_5eb2910e012158b714ca47e7c36e45555f4885b51442407170.chained_1_rescue_1$RUBY$SYNTHETIC__file__(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.0/lib/logstash/filters/grok/timeout_enforcer.rb:21)", "rubyjit.LogStash::Filters::Grok::TimeoutEnforcer$$grok_till_timeout_5eb2910e012158b714ca47e7c36e45555f4885b51442407170.chained_0_ensure_1$RUBY$ensure(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.0/lib/logstash/filters/grok/timeout_enforcer.rb)", "rubyjit.LogStash::Filters::Grok::TimeoutEnforcer$$grok_till_timeout_5eb2910e012158b714ca47e7c36e45555f4885b51442407170.file(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.0/lib/logstash/filters/grok/timeout_enforcer.rb:18)", 
"rubyjit.LogStash::Filters::Grok::TimeoutEnforcer$$grok_till_timeout_5eb2910e012158b714ca47e7c36e45555f4885b51442407170.file(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.0/lib/logstash/filters/grok/timeout_enforcer.rb)", "org.jruby.internal.runtime.methods.JittedMethod.call(JittedMethod.java:101)", "org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:79)"]}
Error: Your application used more memory than the safety cap of 2G.
Specify -J-Xmx####m to increase it (#### = cap size in MB).
Specify -w for full OutOfMemoryError stack trace
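
For reference, the 2G "safety cap" in that last message is the JVM heap. A minimal sketch of how it is usually raised on a package-installed Logstash 5.x; the 4g figure and the paths below are only examples, not a recommendation:

# /etc/logstash/jvm.options (excerpt): raise the initial and maximum heap
-Xms4g
-Xmx4g

Alternatively, the LS_JAVA_OPTS environment variable can pass the same flags for a manual run, e.g. LS_JAVA_OPTS="-Xms4g -Xmx4g" bin/logstash -f <your pipeline config>.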

Is there something wrong with the gsub in my Logstash config? Does it use a lot of memory? Is there an alternative to it?

gsub => ["timestamp", " GMT\+05\:30", ""]

What I'm doing here is removing the timezone part from the timestamp because I don't want the offset to be applied.

Thanks

This is weird. It looks like gsub is going into an infinite loop or something, but I can't see why. The regexp is totally legit.

Hi,

Thanks for the reply.

Is there any other way to achieve the same result?

All I want is for Logstash to ignore the timezone. Or is there a solution where I can keep the timezone and correct the offset it causes?

Is there any other way to achieve the same result?

You could try a ruby filter.
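
For example, something like this in place of the mutate/gsub, inside your existing if [type] blocks (a minimal sketch, assuming the Logstash 5.x event API with event.get/event.set and that grok has already populated the timestamp field):

ruby {
  # Plain-string substitution instead of a regexp-based gsub; removes the zone token.
  code => "ts = event.get('timestamp'); event.set('timestamp', ts.sub(' GMT+05:30', '')) if ts"
}

Since String#sub with a plain string argument does a literal substring replacement rather than a regexp match, it also takes the regexp engine out of that step.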

All I want is for Logstash to ignore the timezone. Or is there a solution where I can keep the timezone and correct the offset it causes?

Why do you want Logstash to ignore the timezone? Date parsing is easier when you have the timezone.
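
For what it's worth, here is a sketch of a date filter that keeps the zone instead of stripping it. It assumes Joda's pattern syntax (a quoted 'GMT' literal followed by a ZZ offset token) accepts the +05:30 part; I have not tested it against your data:

date {
  # parses e.g. "6/19/17 23:12:24:144 GMT+05:30", offset included
  match => ["timestamp", "M/dd/yy HH:mm:ss:SSS 'GMT'ZZ"]
}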

Aren't you using grok to parse the log entry? If so, why not just adjust the grok expression so you don't capture the timezone into the timestamp field?

Hi,

I need to ignore the timezone because there is an error in the time settings.

For example, the timestamp is [6/19/17 23:12:24:144 GMT+05:30]. The grok pattern I used to capture this is \[%{DATA:timestamp}\]. Just for reference, the entire log event is:

[6/19/17 23:12:24:144 GMT+05:30] 00000121 SystemErr R From SignOn.jsp >> RequiredFlag >> true

The entire grok pattern I used is:

\[%{DATA:timestamp}\] %{BASE16NUM:threadID} (?<shortname>\b[A-Za-z0-9\$]{2,}\b)%{SPACE}%{WORD:loglevel}%{SPACE} %{GREEDYDATA:message}

Please tell me how I can adjust the grok pattern to ignore the timezone. I'm still learning how to use Logstash and couldn't figure it out.

Your help is really appreciated.

Thanks

The Grok Constructor website is a useful tool for learning how to build grok expressions.
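
To make the earlier suggestion concrete: one possible adjustment is to match the zone outside the timestamp capture so that only the date and time are captured, for example (assuming every line ends its bracketed header with " GMT+05:30" as in the sample above; untested):

\[%{DATA:timestamp} GMT%{ISO8601_TIMEZONE}\] %{BASE16NUM:threadID} (?<shortname>\b[A-Za-z0-9\$]{2,}\b)%{SPACE}%{WORD:loglevel}%{SPACE} %{GREEDYDATA:message}

With the zone excluded at grok time, the mutate/gsub step could be dropped and the existing date pattern (M/dd/yy HH:mm:ss:SSS) kept as-is.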
