Hi,
I'm trying to consume events from Kafka with Logstash and send them on to Elasticsearch.
I'm currently using a pipeline with the following input:
kafka {
  type => "standard_event"
  bootstrap_servers => "xxx01:9092,xxx02:9092,xxx03:9092,xxx04:9092"
  topics => ["something"]
  group_id => "somehow"
  decorate_events => true
  consumer_threads => 10
  key_deserializer_class => "org.apache.kafka.connect.storage.StringConverter"
  value_deserializer_class => "io.confluent.connect.avro.AvroConverter"
  codec => avro {
    schema_uri => "/root/schema.avsc"
    tag_on_failure => true
  }
}
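For context, in case it matters: if I've read the Kafka docs right, the two classes I'm pointing at are Kafka Connect converters, while the consumer's key.deserializer/value.deserializer settings expect classes implementing the plain consumer Deserializer interface. A sketch using the stock consumer-side classes instead (assuming Confluent's Avro deserializer JARs are on the classpath, and that a schema.registry.url would still need to be supplied somewhere) would look like:

kafka {
  # same settings as above, but with plain Kafka consumer deserializers
  key_deserializer_class => "org.apache.kafka.common.serialization.StringDeserializer"
  value_deserializer_class => "io.confluent.kafka.serializers.KafkaAvroDeserializer"
}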
I'm trying to use custom classes to deserialize the events. I put the corresponding JARs under /usr/share/logstash/lib/, but Logstash somehow doesn't pick them up when I launch it.
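To check whether a class is actually visible to a JVM with a given classpath, I've been using a tiny probe class (my own sketch, not part of Logstash; the default class name below is just the converter from my config):

```java
// ClassCheck.java -- tiny probe: can this JVM load the named class?
public class ClassCheck {
    // Returns true if the class is visible on the current classpath.
    static boolean visible(String name) {
        try {
            Class.forName(name);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String name = args.length > 0
                ? args[0]
                : "org.apache.kafka.connect.storage.StringConverter";
        System.out.println((visible(name) ? "FOUND " : "NOT FOUND ") + name);
    }
}
```

Compiling it with javac and running it with java -cp "/usr/share/logstash/lib/*:." ClassCheck prints NOT FOUND for the converter class on my box, which suggests that directory isn't on the classpath Logstash's JVM actually uses.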
I get the following error:
Error: uncaught throw org.apache.kafka.common.config.ConfigException: Invalid value org.apache.kafka.connect.storage.StringConverter for configuration key.deserializer: Class org.apache.kafka.connect.storage.StringConverter could not be found.
Exception: UncaughtThrowError
Stack: org/jruby/RubyKernel.java:1137:in `throw'
Does anyone know where I should put these JARs so that Logstash can find and use them?