Logstash pipeline terminating due to Avro format of Kafka topic

Hi,
I have a Kafka topic whose messages are in Avro format. A part of a message is pasted below:

 ..

We are using this Avro schema; a part of it is pasted below:

	{
	  "type" : "record",
	  "name" : "ProductDetailsSchema",
	  "namespace" : "xxx.product.schema.avro",
	  "fields" : [ {
		"name" : "products",
		"type" : {
		  "type" : "record",
		  "name" : "products",
		  "fields" : [ {
			"name" : "product",
			"type" : {
			  "type" : "record",
			  "name" : "product",
			  "fields" : [ {
				"name" : "code",
				"type" : "string"
			  }, 

This is the kafka input section of my Logstash configuration file:

input {
  kafka {
    bootstrap_servers => "xx.xx.xx.xx:9092"
    topics => ["ProductsTopic-QA1"]
    key_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
    value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
    codec => avro {
      schema_uri => "/tmp/ProductsAvro.avsc"
    }
  }
}
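Since the Logstash avro codec expects bare Avro bytes, one thing worth checking is whether the producer actually wrote messages in the Confluent Schema Registry wire format (a 0x00 magic byte plus a 4-byte big-endian schema ID in front of the Avro body), which the plain codec cannot parse. A small stdlib-only Python sketch to inspect a raw payload (the helper names are made up for illustration):

```python
import struct

CONFLUENT_MAGIC = 0x00  # first byte of the Confluent Schema Registry wire format

def looks_like_confluent_framing(payload: bytes) -> bool:
    """Heuristic: Confluent-serialized messages start with a 0x00 magic byte
    followed by a 4-byte big-endian schema ID; bare Avro has no such header."""
    return len(payload) > 5 and payload[0] == CONFLUENT_MAGIC

def split_confluent_frame(payload: bytes) -> tuple[int, bytes]:
    """Strip the 5-byte header and return (schema_id, avro_body)."""
    magic, schema_id = struct.unpack(">bI", payload[:5])
    if magic != CONFLUENT_MAGIC:
        raise ValueError("not Confluent wire format")
    return schema_id, payload[5:]

# Fabricated sample: schema ID 42, then an Avro-encoded string "hi"
# (zigzag length varint 0x04, then the two UTF-8 bytes).
sample = b"\x00\x00\x00\x00\x2a" + b"\x04hi"
assert looks_like_confluent_framing(sample)
schema_id, body = split_confluent_frame(sample)
assert schema_id == 42 and body == b"\x04hi"
```

Dumping one raw record from the topic (e.g. with a console consumer) and feeding it to such a check would tell you whether the 5-byte header is present.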

After restarting the pipeline for this topic, I get the error below and the pipeline terminates:

org.jruby.exceptions.ArgumentError: (ArgumentError) negative length -11 given
	at org.jruby.ext.stringio.StringIO.read(org/jruby/ext/stringio/StringIO.java:851) ~[jruby.jar:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.avro_minus_1_dot_10_dot_2.lib.avro.io.read(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/avro-1.10.2/lib/avro/io.rb:106) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.avro_minus_1_dot_10_dot_2.lib.avro.io.read_bytes(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/avro-1.10.2/lib/avro/io.rb:93) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.avro_minus_1_dot_10_dot_2.lib.avro.io.read_string(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/avro-1.10.2/lib/avro/io.rb:99) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.avro_minus_1_dot_10_dot_2.lib.avro.io.read_data(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/avro-1.10.2/lib/avro/io.rb:276) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.avro_minus_1_dot_10_dot_2.lib.avro.io.read_record(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/avro-1.10.2/lib/avro/io.rb:364) ~[?:?]
	at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1821) ~[jruby.jar:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.avro_minus_1_dot_10_dot_2.lib.avro.io.read_record(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/avro-1.10.2/lib/avro/io.rb:361) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.avro_minus_1_dot_10_dot_2.lib.avro.io.read_data(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/avro-1.10.2/lib/avro/io.rb:287) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.avro_minus_1_dot_10_dot_2.lib.avro.io.read_record(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/avro-1.10.2/lib/avro/io.rb:364) ~[?:?]
	at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1821) ~[jruby.jar:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.avro_minus_1_dot_10_dot_2.lib.avro.io.read_record(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/avro-1.10.2/lib/avro/io.rb:361) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.avro_minus_1_dot_10_dot_2.lib.avro.io.read_data(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/avro-1.10.2/lib/avro/io.rb:287) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.avro_minus_1_dot_10_dot_2.lib.avro.io.read_record(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/avro-1.10.2/lib/avro/io.rb:364) ~[?:?]
	at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1821) ~[jruby.jar:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.avro_minus_1_dot_10_dot_2.lib.avro.io.read_record(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/avro-1.10.2/lib/avro/io.rb:361) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.avro_minus_1_dot_10_dot_2.lib.avro.io.read_data(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/avro-1.10.2/lib/avro/io.rb:287) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.avro_minus_1_dot_10_dot_2.lib.avro.io.read(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/avro-1.10.2/lib/avro/io.rb:252) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_codec_minus_avro_minus_3_dot_4_dot_0_minus_java.lib.logstash.codecs.avro.decode(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-codec-avro-3.4.0-java/lib/logstash/codecs/avro.rb:110) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_integration_minus_kafka_minus_10_dot_12_dot_0_minus_java.lib.logstash.inputs.kafka.handle_record(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-kafka-10.12.0-java/lib/logstash/inputs/kafka.rb:346) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_integration_minus_kafka_minus_10_dot_12_dot_0_minus_java.lib.logstash.inputs.kafka.thread_runner(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-kafka-10.12.0-java/lib/logstash/inputs/kafka.rb:319) ~[?:?]
	at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_integration_minus_kafka_minus_10_dot_12_dot_0_minus_java.lib.logstash.inputs.kafka.thread_runner(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-integration-kafka-10.12.0-java/lib/logstash/inputs/kafka.rb:319) ~[?:?]
[2022-10-11T05:47:39,725][INFO ][logstash.javapipeline    ][pcmkafka-product-events] Pipeline terminated {"pipeline.id"=>"pcmkafka-product-events"}
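For context on why the error reads "negative length -11": Avro string and bytes lengths are zigzag-encoded varints, so a byte that was never meant to be Avro (for example a framing or header byte) can happily decode to a negative length, which is what `StringIO.read` then rejects. A minimal stdlib-only illustration (the decoder below is a hypothetical re-implementation, not the codec's actual code):

```python
def read_avro_long(buf: bytes, pos: int = 0) -> tuple[int, int]:
    """Decode one zigzag-encoded varint (Avro int/long) from buf.
    Returns (value, next_position)."""
    shift, acc = 0, 0
    while True:
        b = buf[pos]
        pos += 1
        acc |= (b & 0x7F) << shift
        if not (b & 0x80):
            break
        shift += 7
    return (acc >> 1) ^ -(acc & 1), pos

# A well-formed Avro string length: 0x04 zigzag-decodes to 2.
assert read_avro_long(b"\x04")[0] == 2
# But a stray byte such as 0x15 decodes to -11 - the same negative
# length reported in the stack trace above.
assert read_avro_long(b"\x15")[0] == -11
```

So a "negative length" from the avro codec usually means the decoder's read position is misaligned with the schema, e.g. because the payload carries extra framing or was written with a different schema.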

Can anyone please suggest a solution for this?
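If the messages do turn out to be serialized through Confluent Schema Registry, recent versions of the Kafka integration (the trace shows logstash-integration-kafka 10.12.0) can fetch the schema from the registry and decode the Avro themselves, instead of the avro codec. A hedged sketch, assuming a registry is reachable (the URL below is a placeholder, and the deserializer settings must be left at their defaults for this mode):

```
input {
  kafka {
    bootstrap_servers => "xx.xx.xx.xx:9092"
    topics => ["ProductsTopic-QA1"]
    # Let the plugin resolve schemas via the registry and decode Avro;
    # no avro codec and no custom value_deserializer_class in this mode.
    schema_registry_url => "http://xx.xx.xx.xx:8081"
  }
}
```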
