Hello,
I am using LogStash with the logstash-codec-avro_schema_registry codec.
However, the encode side of the codec is not implemented yet, so I need to implement it myself.
Here is my LogStash configuration (POC):
input {
  file {
    path => '/tmp/logstash_input'
  }
}
output {
  kafka {
    topic_id => 'test'
    codec => avro_schema_registry {
      endpoint => 'http://localhost:2181'
    }
  }
}
Every line in the input file that gets processed by LogStash will contain both the full Avro schema and the payload.
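For example, a line could look something like this (the field names "schema" and "payload" are just my own convention, not anything the codec requires):

{"schema": {"type": "record", "name": "test", "fields": [{"name": "msg", "type": "string"}]}, "payload": {"msg": "hello"}}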
My goal in implementing the encode method is to extract the schema from the LogStash event, send it to the schema registry, and get a schema_id back.
Once I have the schema_id, I will replace the schema with it and send the resulting message to Kafka.
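To make it concrete, here is a rough, untested sketch of what I imagine the encode method could look like inside the codec class. It assumes the hypothetical "schema"/"payload" event fields from above, an @endpoint and @topic coming from the codec's config options, the Confluent Schema Registry REST API (POST /subjects/<subject>/versions) called directly via net/http, and the Confluent wire format (0x00 magic byte + 4-byte big-endian schema id + Avro binary payload):

# Hypothetical, untested sketch - not the plugin's actual implementation.
require "avro"
require "json"
require "net/http"
require "stringio"
require "uri"

MAGIC_BYTE = [0].pack("C")

# Register the schema under a subject and return the id the registry assigns.
def register_schema(endpoint, subject, schema_json)
  uri = URI("#{endpoint}/subjects/#{subject}/versions")
  request = Net::HTTP::Post.new(uri, "Content-Type" => "application/vnd.schemaregistry.v1+json")
  request.body = { schema: schema_json }.to_json
  response = Net::HTTP.start(uri.host, uri.port) { |http| http.request(request) }
  JSON.parse(response.body)["id"]
end

def encode(event)
  schema_json = event.get("schema")   # or event["schema"] on older LogStash versions
  payload     = event.get("payload")

  # Ask the registry for the schema id (registers the schema if it is new).
  schema_id = register_schema(@endpoint, "#{@topic}-value", schema_json)

  # Confluent wire format: magic byte, 4-byte big-endian schema id, Avro binary data.
  schema = Avro::Schema.parse(schema_json)
  buffer = StringIO.new
  buffer.write(MAGIC_BYTE)
  buffer.write([schema_id].pack("N"))
  writer = Avro::IO::DatumWriter.new(schema)
  writer.write(payload, Avro::IO::BinaryEncoder.new(buffer))

  # Hand the encoded bytes back to the output (standard codec callback).
  @on_event.call(event, buffer.string)
end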
Does anyone know if this can be done, and if so, how?
I am fairly new to Ruby, LogStash, and Avro.
Thank you,
Nir