Hi, I'm using Logstash to load logs from Kafka into Elasticsearch, running Logstash on OpenShift. I'm receiving the error below when trying to read logs from the Kafka topic. Can anyone help me figure out what I'm missing here?
Error:
[2019-07-16T14:01:46,777][FATAL][logstash.runner ]
An unexpected error occurred! {:error=>#<Avro::SchemaParseError: Error validating default for meta: at . expected type map, got null>,
:backtrace=>[
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/avro-1.9.0/lib/avro/schema.rb:403:in `validate_default!'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/avro-1.9.0/lib/avro/schema.rb:377:in `initialize'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/avro-1.9.0/lib/avro/schema.rb:216:in `block in make_field_objects'",
"org/jruby/RubyArray.java:1734:in `each'",
"org/jruby/RubyEnumerable.java:1067:in `each_with_index'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/avro-1.9.0/lib/avro/schema.rb:209:in `make_field_objects'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/avro-1.9.0/lib/avro/schema.rb:240:in `initialize'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/avro-1.9.0/lib/avro/schema.rb:72:in `real_parse'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/avro-1.9.0/lib/avro/schema.rb:38:in `parse'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-codec-avro_schema_registry-1.1.1/lib/logstash/codecs/avro_schema_registry.rb:158:in `get_schema'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-codec-avro_schema_registry-1.1.1/lib/logstash/codecs/avro_schema_registry.rb:228:in `decode'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-kafka-8.3.1/lib/logstash/inputs/kafka.rb:258:in `block in thread_runner'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-kafka-8.3.1/lib/logstash/inputs/kafka.rb:257:in `block in thread_runner'"]}
Config:
input {
  kafka {
    bootstrap_servers => "{KAFKA_HOST}"
    topics => ["{KAFKA_TOPIC}"]
    security_protocol => "SSL"
    ssl_truststore_location => "{KAFKA_TRUSTSTORE}"
    ssl_truststore_password => "{KAFKA_TRUSTSTORE_PWD}"
    ssl_keystore_location => "{KAFKA_KEYSTORE}"
    ssl_keystore_password => "{KAFKA_KEYSTORE_PWD}"
    ssl_key_password => "{KAFKA_KEY_PWD}"
    consumer_threads => "10"
    decorate_events => true
    codec => avro_schema_registry {
      endpoint => "{KAFKA_SCHEMAREG}"
    }
    value_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
  }
}