Avro SchemaParseError

Hi, I'm using Logstash to load logs from Kafka into Elasticsearch, running Logstash on OpenShift. I'm receiving the error below when trying to read logs from the Kafka topic. Can anyone help me figure out what I'm missing here?

Error:

[2019-07-16T14:01:46,777][FATAL][logstash.runner ]
An unexpected error occurred! {:error=>#<Avro::SchemaParseError: Error validating default for meta: at . expected type map, got null>,
:backtrace=>[
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/avro-1.9.0/lib/avro/schema.rb:403:in `validate_default!'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/avro-1.9.0/lib/avro/schema.rb:377:in `initialize'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/avro-1.9.0/lib/avro/schema.rb:216:in `block in make_field_objects'",
"org/jruby/RubyArray.java:1734:in `each'",
"org/jruby/RubyEnumerable.java:1067:in `each_with_index'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/avro-1.9.0/lib/avro/schema.rb:209:in `make_field_objects'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/avro-1.9.0/lib/avro/schema.rb:240:in `initialize'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/avro-1.9.0/lib/avro/schema.rb:72:in `real_parse'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/avro-1.9.0/lib/avro/schema.rb:38:in `parse'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-codec-avro_schema_registry-1.1.1/lib/logstash/codecs/avro_schema_registry.rb:158:in `get_schema'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-codec-avro_schema_registry-1.1.1/lib/logstash/codecs/avro_schema_registry.rb:228:in `decode'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-kafka-8.3.1/lib/logstash/inputs/kafka.rb:258:in `block in thread_runner'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-kafka-8.3.1/lib/logstash/inputs/kafka.rb:257:in `block in thread_runner'"]}
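The message "Error validating default for meta: at . expected type map, got null" suggests the schema fetched from the registry declares a field named meta whose type is a plain (non-union) map but whose default is null. A hypothetical fragment of what such a field might look like (the field name comes from the error; the values type is an assumption):

```json
{ "name": "meta", "type": { "type": "map", "values": "string" }, "default": null }
```

A null default is only valid when the field's type is null itself, or a union whose first branch is null, e.g. `"type": ["null", {"type": "map", "values": "string"}]`.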

Config:

input {
  kafka {
    bootstrap_servers => "{KAFKA_HOST}"
    topics => ["{KAFKA_TOPIC}"]
    security_protocol => "SSL"
    ssl_truststore_location => "{KAFKA_TRUSTSTORE}"
    ssl_truststore_password => "{KAFKA_TRUSTSTORE_PWD}"
    ssl_keystore_location => "{KAFKA_KEYSTORE}"
    ssl_keystore_password => "{KAFKA_KEYSTORE_PWD}"
    ssl_key_password => "{KAFKA_KEY_PWD}"
    consumer_threads => "10"
    decorate_events => true
    codec => avro_schema_registry {
      endpoint => "{KAFKA_SCHEMAREG}"
    }
    value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
  }
}

I ran into the same issue just now. For me, it was a bad avro schema that looked something like:
{ "name": "first", "type": ["null", "boolean"] , "default": false}

According to the Avro documentation, the default value for a union field must match the first type (branch) in the union, not just any of them. So my default value of false didn't match the null type listed first, causing this issue.
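To make the rule concrete, here is a minimal Python sketch (my own illustration, not the avro gem's actual code) of how a union field's default is validated against only the first branch. The fixes are either `"default": null` with `["null", "boolean"]`, or reordering the union to `["boolean", "null"]` and keeping `"default": false`:

```python
def default_matches(avro_type, default):
    """Return True if `default` is a valid default for `avro_type`.

    Only covers the primitive/union/map cases relevant to this thread.
    """
    # For unions, Avro validates the default against the FIRST branch only.
    if isinstance(avro_type, list):
        return default_matches(avro_type[0], default)
    # Complex types appear as dicts, e.g. {"type": "map", "values": "string"}.
    if isinstance(avro_type, dict):
        avro_type = avro_type["type"]
    if avro_type == "null":
        return default is None
    if avro_type == "boolean":
        return isinstance(default, bool)
    if avro_type == "map":
        return isinstance(default, dict)
    return True  # other types not checked in this sketch

# The schema from the answer above: union is ["null", "boolean"],
# so a default of false fails, but null succeeds.
assert not default_matches(["null", "boolean"], False)
assert default_matches(["null", "boolean"], None)
# Reordering the union makes false a legal default.
assert default_matches(["boolean", "null"], False)
# The original poster's error: type map, default null.
assert not default_matches({"type": "map", "values": "string"}, None)
```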

If you still need help with this, could you post your avro schema?


Thanks for the response. It was the same issue. I fixed the schema and it worked fine.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.