Logstash crashes when the Kafka producer sends fewer fields than expected

Hello,
We are using the logstash-integration-kafka plugin to read Avro-encoded messages from a Kafka topic. If my Logstash consumer uses a reader schema that defines two optional fields but receives a message containing only one of them, Logstash crashes.
I would expect Logstash to ignore the missing field (or, at least, put a null value in Elasticsearch).
Am I misunderstanding something?

Here is a really simple example:

Producer schema registry file

{
  "namespace": "fr.myorg.kafka.test",
  "type": "record",
  "name": "BasicTrace",
  "fields": [
    { "name": "id", "type": "string"}
  ]
}

Consumer schema registry file (logstash)

{
  "namespace": "fr.myorg.kafka.test",
  "type": "record",
  "name": "BasicTrace",
  "fields": [
    { "name": "id", "type": "string"},
    { "name": "testOpt", "type": ["null", "string"], "default": null }
  ]
}
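
For reference, the consumer side is wired up roughly like this; a minimal sketch, assuming the avro codec's `schema_uri` option points at the reader schema above (the broker address, topic name, and schema file path are placeholders, not our real values):

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"   # placeholder broker
    topics => ["basic-trace"]               # placeholder topic
    codec => avro {
      # reader schema: the consumer .avsc shown above
      schema_uri => "/etc/logstash/BasicTrace.avsc"
    }
  }
}
```

The crash happens when the producer writes records with only `id`, while the codec decodes them against the reader schema that also declares `testOpt`.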

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.