File input and avro codec

All,
I am having some issues getting the file input and the avro codec to work together.
The idea is to read an Avro file, decode it into a JSON object, and then insert it into Elasticsearch.

Below is my config file.
input {
  file {
    path => "C:/dev/testData/xxx/weather.avro"
    start_position => "beginning"
    codec => avro {
      schema_uri => "C:/dev/testData/xxx/weather.avsc"
    }
    sincedb_path => "C:/dev/testData/property/db"
  }
}

output {
  stdout { codec => rubydebug }
}

Here is the log:

[2018-10-18T13:45:09,523][DEBUG][filewatch.tailmode.handlers.grow] read_to_eof: get chunk
[2018-10-18T13:45:09,599][DEBUG][logstash.inputs.file ] Received line {:path=>"C:/dev/testData/chewer/weather.avro", :text=>"Obj\x01\x04\x14avro.codec\bnull\x16avro.schema\xF2\x02{\"type\":\"record\",\"name\":\"Weather\",\"namespace\":\"test\",\"fields\":[{\"name\":\"station\",\"type\":\"string\"},{\"name\":\"time\",\"type\":\"long\"},{\"name\":\"temp\",\"type\":\"int\"}],\"doc\":\"A weather reading.\"}\x00\xB0\x81\xB3\xC4"}
[2018-10-18T13:45:10,155][ERROR][filewatch.tailmode.handlers.grow] read_to_eof: general error reading C:/dev/testData/chewer/weather.avro {"error"=>"#<ArgumentError: negative length -40 given>", "backtrace"=>["org/jruby/ext/stringio/StringIO.java:788:in `read'", "C:/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/avro-1.8.2/lib/avro/io.rb:106:in `read'", "C:/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/avro-1.8.2/lib/avro/io.rb:93:in `read_bytes'", "C:/logstash-6.4.0/vendor/bundle/jruby/2.3.0/gems/avro-1.8.2/lib/avro/io.rb:99:in `read_string'"]}
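For context on where a "negative length -40" can come from (a hedged sketch, not taken from the log itself): Avro encodes string/bytes lengths as zigzag-encoded varints, so a reader that ends up misaligned in the byte stream, for example because a line-oriented input split a binary container file on newline bytes, can decode an arbitrary byte as a negative length. The stdlib-only Python function below is a minimal illustration of Avro's zigzag varint decoding; the byte 0x4F is one single-byte value that decodes to exactly -40.

```python
def decode_zigzag_varint(data: bytes, pos: int = 0):
    """Decode one Avro zigzag-encoded varint starting at `pos`.

    Avro writes signed longs (including string/bytes lengths) as
    zigzag-encoded, little-endian base-128 varints. Returns the
    decoded value and the position after the varint.
    """
    shift = 0
    value = 0
    while True:
        byte = data[pos]
        pos += 1
        value |= (byte & 0x7F) << shift
        if not byte & 0x80:  # high bit clear: last byte of the varint
            break
        shift += 7
    # Zigzag decode: even values map to >= 0, odd values to < 0.
    return (value >> 1) ^ -(value & 1), pos

# A well-formed length such as 11 encodes as the single byte 0x16 ...
print(decode_zigzag_varint(b"\x16"))  # (11, 1)
# ... but a decoder that lands on an arbitrary byte such as 0x4F
# reads it as the length -40, matching the ArgumentError above.
print(decode_zigzag_varint(b"\x4f"))  # (-40, 1)
```

This is only an illustration of the failure mode, not a claim about which exact byte the codec hit in weather.avro.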
The test files are from Avro's own repository (weather.avro).

I am using avro 1.8.2 and logstash-codec-avro 3.2.3-java.
Please advise.
Gordon
