Logstash 6.2.2: Empty square brackets in json field names crash the pipeline

Empty square brackets in field names cause the Logstash pipeline to crash with the following error:

Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.

Here's a Logstash configuration file that reproduces the issue. The issue also exists on Logstash 6.2.4. We started seeing it after migrating from an older 2.x release to Logstash 6.2.2.

input { 
  generator {
    lines => [
      '{"foo":"a", "[]":""}',
      '{"foo":"b", "[":""}',
      '{"foo":"c", "]":""}',
      '{"foo":"d", "bar":"x"}',
      '{"foo":"e", "bar":"y"}'
    ]
    count => 1
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  stdout {
    codec => rubydebug
  }
}

I get the following exception on stdout:

Exception in thread "Ruby-0-Thread-8@[main]>worker0: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:385" java.lang.ArrayIndexOutOfBoundsException: -1
at java.util.ArrayList.elementData(ArrayList.java:422)
at java.util.ArrayList.remove(ArrayList.java:499)
at org.logstash.FieldReference.parse(FieldReference.java:167)
at org.logstash.FieldReference.parseToCache(FieldReference.java:142)
at org.logstash.FieldReference.from(FieldReference.java:74)
at org.logstash.ext.JrubyEventExtLibrary$RubyEvent.ruby_set_field(JrubyEventExtLibrary.java:88)
at org.logstash.ext.JrubyEventExtLibrary$RubyEvent$INVOKER$i$2$0$ruby_set_field.call(JrubyEventExtLibrary$RubyEvent$INVOKER$i$2$0$ruby_set_field.gen)
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:193)
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:323)
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73)
at org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:132)
at org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:148)
at org.jruby.runtime.IRBlockBody.doYield(IRBlockBody.java:186)
at org.jruby.runtime.BlockBody.yield(BlockBody.java:116)
at org.jruby.runtime.Block.yield(Block.java:165)
at org.jruby.RubyHash$12.visit(RubyHash.java:1362)
at org.jruby.RubyHash$12.visit(RubyHash.java:1359)
at org.jruby.RubyHash.visitLimited(RubyHash.java:662)
at org.jruby.RubyHash.visitAll(RubyHash.java:647)
at org.jruby.RubyHash.iteratorVisitAll(RubyHash.java:1319)
at org.jruby.RubyHash.each_pairCommon(RubyHash.java:1354)
at org.jruby.RubyHash.each(RubyHash.java:1343)
at org.jruby.RubyHash$INVOKER$i$0$0$each.call(RubyHash$INVOKER$i$0$0$each.gen)
at org.jruby.internal.runtime.methods.JavaMethod$JavaMethodZeroBlock.call(JavaMethod.java:498)
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:298)
at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:79)
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:83)
at org.jruby.ir.instructions.CallBase.interpret(CallBase.java:428)
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:355)

...

FYI in 6.6 the error message changes to

"exception"=>"Invalid FieldReference: `[]`" [...]

for all three variants.

Hi Badger, thanks for the reply. Is there a fix available for the 6.2 branch? I'm also interested in good workarounds in the meantime. My current workaround of checking the payload for offending field names before calling the json filter seems to work, but I'm worried that the regex check is inefficient on large payloads in a high-traffic pipeline.
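For reference, the guard I'm using looks roughly like this (just a sketch; the regex and the tag name are my own choices, and it assumes the only offending keys are the bare/empty square brackets shown in the examples above):

```
filter {
  # Skip the json filter when a quoted key consists only of square brackets,
  # e.g. "[]", "[" or "]" -- these are the keys that crash the pipeline
  if [message] =~ /"(\[\]|\[|\])"\s*:/ {
    mutate { add_tag => [ "_invalid_field_name" ] }
  } else {
    json { source => "message" }
  }
}
```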

It's not fixed in 6.6; it just crashes with a different error message :frowning: So there is no fix anywhere yet.

FYI a json codec does not have the same issue, so if you are pulling lines like that from an input, that might give you another way to build a workaround.
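In case it helps other readers, the codec-based workaround would look something like this (a sketch only; the kafka input here is just an example stand-in for whatever input delivers the JSON lines):

```
input {
  kafka {
    # ...broker and topic settings go here...
    codec => json   # decode each message as JSON at the input stage,
                    # which per the note above avoids the crashing filter path
  }
}
```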

Thanks for the workaround. Our input is an Avro-encoded message with one of the fields containing a JSON-encoded string. We use the Avro codec on the input and the json filter to parse that string, so I don't think we can use a json codec on the input.
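One alternative I'm considering is a ruby filter that parses the string itself and strips bracket characters out of key names before setting any fields, so that no invalid field reference ever reaches the event. A sketch, assuming the JSON string lives in a hypothetical json_payload field (untested, field name is illustrative):

```
filter {
  ruby {
    code => '
      require "json"
      begin
        parsed = JSON.parse(event.get("json_payload"))
        parsed.each do |k, v|
          # Replace [ and ] with _ so the key is a valid Logstash field name;
          # a "[]" key becomes "__"
          event.set(k.gsub(/[\[\]]/, "_"), v)
        end
      rescue JSON::ParserError
        event.tag("_jsonparsefailure")
      end
    '
  }
}
```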
