Logstash error

Hello team,
I am getting the following error on my Logstash pipeline. Can you please help me identify the exact issue?

Logstash version: 7.9

C:\Users\mangeshsuresh.jadhav\Downloads\EKL\logstash-7.9.0>bin\logstash -f wubsprod.conf
'findstr' is not recognized as an internal or external command,
operable program or batch file.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.jruby.ext.openssl.SecurityHelper (file:/C:/Users/MANGES~1.JAD/AppData/Local/Temp/jruby-15652/jruby2930671875986759032jopenssl.jar) to field java.security.MessageDigest.provider
WARNING: Please consider reporting this to the maintainers of org.jruby.ext.openssl.SecurityHelper
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Sending Logstash logs to C:/Users/mangeshsuresh.jadhav/Downloads/EKL/logstash-7.9.0/logs which is now configured via log4j2.properties
[2021-06-27T13:06:49,153][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.9.0", "jruby.version"=>"jruby 9.2.12.0 (2.5.7) 2020-07-01 db01a49ba6 OpenJDK 64-Bit Server VM 14.0.1+7 on 14.0.1+7 +jit [mswin32-x86_64]"}
[2021-06-27T13:06:49,423][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2021-06-27T13:06:51,242][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::ClassCastException", :message=>"class org.jruby.specialized.RubyArrayTwoObject cannot be cast to class java.lang.String (org.jruby.specialized.RubyArrayTwoObject is in unnamed module of loader 'app'; java.lang.String is in module java.base of loader 'bootstrap')", :backtrace=>["org.logstash.config.ir.graph.PluginVertex.<init>(PluginVertex.java:37)", "org.logstash.config.ir.imperative.PluginStatement.toGraph(PluginStatement.java:57)", "org.logstash.config.ir.imperative.ComposedSequenceStatement.toGraph(ComposedSequenceStatement.java:44)", "org.logstash.config.ir.imperative.IfStatement.toGraph(IfStatement.java:103)", "org.logstash.config.ir.imperative.ComposedSequenceStatement.toGraph(ComposedSequenceStatement.java:44)", "org.logstash.config.ir.imperative.IfStatement.toGraph(IfStatement.java:103)", "org.logstash.config.ir.ConfigCompiler.toGraphWithUntypedException(ConfigCompiler.java:119)", "org.logstash.config.ir.ConfigCompiler.lambda$compileGraph$3(ConfigCompiler.java:114)", "java.base/java.util.stream.Collectors.lambda$uniqKeysMapAccumulator$1(Collectors.java:178)", "java.base/java.util.stream.ReduceOps$3ReducingSink.accept(ReduceOps.java:169)", "java.base/java.util.HashMap$EntrySpliterator.forEachRemaining(HashMap.java:1837)", "java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)", "java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)", "java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913)", "java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)", "java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578)", "org.logstash.config.ir.ConfigCompiler.compileGraph(ConfigCompiler.java:114)", "org.logstash.config.ir.ConfigCompiler.lambda$compileSources$0(ConfigCompiler.java:63)", "java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195)", "java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)", "java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)", "java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)", "java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913)", "java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)", "java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578)", "org.logstash.config.ir.ConfigCompiler.compileSources(ConfigCompiler.java:66)", "org.logstash.config.ir.ConfigCompiler.configToPipelineIR(ConfigCompiler.java:58)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:183)", "org.logstash.execution.JavaBasePipelineExt.initialize(JavaBasePipelineExt.java:69)", "org.logstash.execution.JavaBasePipelineExt$INVOKER$i$1$0$initialize.call(JavaBasePipelineExt$INVOKER$i$1$0$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:837)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1169)", "org.jruby.ir.instructions.InstanceSuperInstr.interpret(InstanceSuperInstr.java:84)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:361)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", 
"org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:86)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:73)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:332)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:86)", "org.jruby.RubyClass.newInstance(RubyClass.java:939)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:332)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:86)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:552)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:361)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:92)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:191)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:178)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:208)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:396)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:205)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:325)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:72)", "org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:116)", "org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:60)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:52)", "org.jruby.runtime.Block.call(Block.java:139)", "org.jruby.RubyProc.call(RubyProc.java:318)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:105)", "java.base/java.lang.Thread.run(Thread.java:832)"]}
warning: thread "Converge PipelineAction::Create<main>" terminated with exception (report_on_exception is true):
LogStash::Error: Don't know how to handle `Java::JavaLang::ClassCastException` for `PipelineAction::Create<main>`
          create at org/logstash/execution/ConvergeResultExt.java:129
             add at org/logstash/execution/ConvergeResultExt.java:57
  converge_state at C:/Users/mangeshsuresh.jadhav/Downloads/EKL/logstash-7.9.0/logstash-core/lib/logstash/agent.rb:370
[2021-06-27T13:06:51,298][ERROR][logstash.agent           ] An exception happened when converging configuration {:exception=>LogStash::Error, :message=>"Don't know how to handle `Java::JavaLang::ClassCastException` for `PipelineAction::Create<main>`"}
[2021-06-27T13:06:51,338][FATAL][logstash.runner          ] An unexpected error occurred! {:error=>#<LogStash::Error: Don't know how to handle `Java::JavaLang::ClassCastException` for `PipelineAction::Create<main>`>, :backtrace=>["org/logstash/execution/ConvergeResultExt.java:129:in `create'", "org/logstash/execution/ConvergeResultExt.java:57:in `add'", "C:/Users/mangeshsuresh.jadhav/Downloads/EKL/logstash-7.9.0/logstash-core/lib/logstash/agent.rb:370:in `block in converge_state'"]}
[2021-06-27T13:06:51,355][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

C:\Users\mangeshsuresh.jadhav\Downloads\EKL\logstash-7.9.0>

Logstash Config:

input { beats { port => 5044 } }
filter {
############################################ VMS ###################################################

  if "xyz" in [labels][app_source] and "prod" in [labels][environment] {
    if "FIXLog" in [labels][kind] {
      if "IN" in [message] {
        grok {
          match => { "message" => "^%{GREEDYDATA:ResponseType}\s%{GREEDYDATA:ResponseTime}\s$" }
          timeout_millis => 10000
          id => "FixGateway-FIXLogIN"
        }
      }
      else if "OUT" in [message] {
        grok {
          match => { "message" => "^%{GREEDYDATA:RequestType}\s%{GREEDYDATA:RequestTime}\s$" }
          timeout_millis => 10000
          id => "FixGateway-FIXLogOUT"
        }
      }
      mutate {
        gsub => [
          "message", "", "|"
        ]
      }
      kv {
        source => message
        field_split_pattern => " |\|"
        id => "wubs-qa-FixGateway_1"
        timeout_millis => 10000
        id => "FixGateway-FIXLogkv"
        whitespace => strict
      }
    }
    else if "APILog" in [labels][kind] and "Error" in [message] {
      grok {
        match => { "message" => "^\[%{DATA}\]\s*\[%{DATA}\s%{WORD:APILogLevel}\]\s*\n\s%{GREEDYDATA:ErrorMessage}\s$" }
        match => { "message" => "^(?m)\[%{DATA}\]\s*%{GREEDYDATA:logmessage}$" }
        timeout_millis => 10000
        id => "FixGateway-APILogError"
      }
    }
    else if "APILog" in [labels][kind] and "Information" in [message] and "received" in [message] {
      grok {
        match => { "message" => "^(?m)\[%{DATA}\]\s*\[%{DATA}\s%{WORD:APILogLevel}\]\s*%{WORD:QuoteRequestType}\s*%{WORD:APILogType}\s*:\s*%{GREEDYDATA:QuoteRequestJson}\s*received at %{MONTHNUM:monthReq}[/-]%{MONTHDAY:dayReq}[/-]%{YEAR:yearReq} %{NUMBER:hourreq}:%{MINUTE:minreq}:%{NUMBER:secreq}$" }
        match => { "message" => "^(?m)\[%{DATA}\]\s*%{GREEDYDATA:logmessage}$" }
        timeout_millis => 10000
        id => "FixGateway-APILogInfo"
      }
      json {
        source => "QuoteRequestJson"
      }
      mutate {
        add_field => { "QuoteRequestTime" => "%{yearReq} %{monthReq} %{dayReq} %{hourreq}:%{minreq}:%{secreq}" }
      }
      date {
        match => [ "QuoteRequestTime", "yyyy MM dd hh:mm:ss" ]
        target => "QuoteRequestTime"
      }
    }
    else if [labels][kind] == "APILog" and "Information" in [message] and "sent" in [message] {
      grok {
        match => { "message" => "^(?m)\[%{DATA}\]\s*\[%{DATA}\s%{WORD:APILogLevel}\]\s*%{WORD:QuoteResponseType}\s*%{WORD:APILogType}\s*:\s*%{GREEDYDATA:QuoteResponseJson}\s*sent at %{MONTHNUM:monthRes}[/-]%{MONTHDAY:dayRes}[/-]%{YEAR:yearRes} %{NUMBER:hourres}:%{NUMBER:minres}:%{NUMBER:secres}$" }
        match => { "message" => "^(?m)\[%{DATA}\]\s*%{GREEDYDATA:logmessage}$" }
        timeout_millis => 10000
        id => "FixGateway-APILogInfosent"
      }
      json {
        source => "QuoteResponseJson"
      }
      mutate {
        add_field => { "QuoteResponseTime" => "%{yearRes} %{monthRes} %{dayRes} %{hourres}:%{minres}:%{secres}" }
      }
      date {
        match => [ "QuoteResponseTime", "yyyy MM dd hh:mm:ss" ]
        target => "QuoteResponseTime"
      }
    }
    if [ResponseId] {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "wubs-xyz-*"
        add_tag => [ "APILog_ReqRes_prod" ]
        query => "RequestId:%{ResponseId} AND @timestamp:[now-5m/d TO now/d]"
        fields => { "QuoteRequestTime" => "QuoteRequestTime" }
      }
    }
    mutate { add_field => { "[@metadata][target_index]" => "wubs-xyz-write-alias" } }
  }

###############################################################################################

}

###############################################################################################

output {
  if ( [labels][app_source] in [ "abc", "xyz" ] and "prod" in [labels][environment] ) {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "%{[@metadata][target_index]}"
    }
  }
}

class org.jruby.specialized.RubyArrayTwoObject cannot be cast to class java.lang.String

Wow, that's a horrible error message. The problem is that you have two id options on your kv filter. They get combined into an array of strings, but the option expects a single string, not an array. The following minimal configuration reproduces the same error message:

input { generator { count => 1 lines => [ '' ] } }
output { stdout { codec => rubydebug { metadata => false } } }
filter { kv { id => "a" id => "b" } }
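For reference, a minimal sketch of how that kv block from your config could look once the duplicate setting is removed (which of the two ids you keep is up to you; I just picked one):

kv {
  source => message
  field_split_pattern => " |\|"
  timeout_millis => 10000
  id => "FixGateway-FIXLogkv"   # a single id; the duplicate id is what triggered the ClassCastException
  whitespace => strict
}

You should also be able to catch this kind of problem without starting the inputs by running bin\logstash -f wubsprod.conf --config.test_and_exit, which validates the configuration and then exits.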

Ohhh. Thank you so much for your help. I deleted the duplicate ids and now it is working. Great!
