Logstash-codec-avro parsing wrong values

Hi,
I'm writing Avro data to Kafka and then using Logstash to index it into Elasticsearch, but all float values printed by Logstash are wrong. I'm sending the following record to Kafka:

ProducerRecord(topic=weather, partition=0, headers=RecordHeaders(headers = [], isReadOnly = true), key=BN, value={"Temperature": 20.49, "Humidity": 56.0, "Pressure": 1011.0, "Wind": 1.0, "Cloudiness": 75.0}, timestamp=1562936770006)

In Kafka it is stored properly but logstash console output is

{
       "@version" => "1",
       "Pressure" => 23.718717575073242,
     "Cloudiness" => -0.09371757507324219,
           "Wind" => 56.0,
     "@timestamp" => 2019-07-12T13:06:10.012Z,
    "Temperature" => 0.0,
       "Humidity" => -0.09371758997440338
}
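I noticed the garbled values look as though a few extra bytes were read before the floats. From what I've read, the Confluent serializer prepends a 5-byte header (a magic byte plus a 4-byte schema-registry ID) that a plain Avro decoder would not skip. A small Python sketch of that suspicion, using only `struct` (the schema ID of 1 is made up; the exact garbage values would depend on the real ID):

```python
import struct

# Avro binary encoding of a record whose fields are all floats is just the
# five IEEE-754 little-endian float32 values back to back (no field names).
values = [20.49, 56.0, 1011.0, 1.0, 75.0]  # Temperature, Humidity, Pressure, Wind, Cloudiness
payload = struct.pack("<5f", *values)

# Confluent wire format: magic byte 0x00 + 4-byte schema ID (big-endian),
# then the Avro payload. Schema ID 1 is a placeholder here.
wire = b"\x00" + struct.pack(">I", 1) + payload

# A decoder unaware of the header starts reading floats at offset 0,
# so every value is assembled from the wrong bytes:
shifted = struct.unpack_from("<5f", wire, 0)
# Skipping the 5-byte header recovers the real values:
correct = struct.unpack_from("<5f", wire, 5)

print(shifted)  # garbage; the first value is even 0.0 for small schema IDs
print(correct)  # ~(20.49, 56.0, 1011.0, 1.0, 75.0)
```

That would at least be consistent with the stray `0.0` in my output, since the header's leading bytes are zeros for small schema IDs.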

My Logstash config file is:

input {
  kafka {
    codec => avro {
      schema_uri => "./weather.avsc"
    }
    bootstrap_servers => "localhost:9092"
    topics => ["weather"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "weather"
    workers => 1
  }
  stdout {}
}

and my schema, weather.avsc:

{
  "namespace": "weather",
  "type": "record",
  "name": "WeatherMessage",
  "fields": [
    { "name": "Temperature", "type": "float" },
    { "name": "Humidity",    "type": "float" },
    { "name": "Pressure",    "type": "float" },
    { "name": "Wind",        "type": "float" },
    { "name": "Cloudiness",  "type": "float" }
  ]
}

I'm using Logstash 7.2 with io.confluent:kafka-avro-serializer:3.3.3 and org.apache.avro:avro-tools:1.9.0.
Can you please give me some hints on what's wrong and how to fix it?
