Kafka (Confluent Platform) input - broken message encoding

Hi, I have Confluent Platform (version 4.1.1) with a Kafka Connect JDBC source connector configured to read data from a MySQL database. The connector configuration is:

name = source-mysql-requests
connection.url = jdbc:mysql://localhost:3306/Requests
connector.class = io.confluent.connect.jdbc.JdbcSourceConnector
connection.user = ***
connection.password = ***
mode = incrementing
incrementing.column.name = ID
tasks.max = 5
topic.prefix = requests_
poll.interval.ms = 1000
batch.max.rows = 100
table.poll.interval.ms = 1000
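
(For completeness: I load this from a properties file, but I believe the same connector could also be registered through the Kafka Connect REST API roughly like this, assuming Connect runs in distributed mode on its default port 8083. Just a sketch, not my exact setup.)

import requests

# Sketch (assumption): the same connector registered via the Kafka Connect
# REST API, assuming Connect in distributed mode on the default port 8083.
connector = {
    "name": "source-mysql-requests",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:mysql://localhost:3306/Requests",
        "connection.user": "***",
        "connection.password": "***",
        "mode": "incrementing",
        "incrementing.column.name": "ID",
        "tasks.max": "5",
        "topic.prefix": "requests_",
        "poll.interval.ms": "1000",
        "batch.max.rows": "100",
        "table.poll.interval.ms": "1000",
    },
}

resp = requests.post("http://localhost:8083/connectors", json=connector)
print(resp.status_code, resp.text)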

I also have Logstash (version 6.2.4) reading the relevant Kafka topic. Here is its configuration:

input {
    kafka {
        bootstrap_servers => "localhost:9092"
        topics => ["requests_Operation"]
        add_field => { "[@metadata][flag]" => "operation" }
    }
}
output {
    if [@metadata][flag] == "operation" {
        stdout {
            codec => rubydebug
        }
    }
}
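
To see what Logstash is actually being handed, I think its default plain string deserialization amounts to roughly this (a sketch with the confluent-kafka Python client; the group id is just for illustration):

from confluent_kafka import Consumer

# Sketch (assumption): reading the topic with plain string/bytes deserialization,
# which is what the Logstash kafka input does by default. The group id is made up.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "plain-string-test",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["requests_Operation"])

msg = consumer.poll(10.0)
if msg is not None and not msg.error():
    # Raw record value forced into a string - this is the kind of output
    # Logstash shows me (see further below).
    print(msg.value().decode("utf-8", errors="replace"))

consumer.close()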

When I run kafka-avro-console-consumer as a test, I get messages like this:

{"ID":388625154,"ISSUER_ID":"8e427b6b-1176-4d4a-8090-915fedcef870","SERVICE_ID":"mercury-g2b.service:1.4","OPERATION":"prepareOutcomingConsignmentRequest","STATUS":"COMPLETED","RECEIVE_REQUEST_DATE":1525381951000,"PRODUCE_RESULT_DATE":1525381951000}

But in Logstash I get something garbled and unreadable:

"\u0000\u0000\u0000\u0000\u0001����\u0002Hfdebfb95-218a-11e2-a69b-b499babae7ea.mercury-g2b.service:1.4DprepareOutcomingConsignmentRequest\u0012COMPLETED���X���X"

What could be going wrong?

Thanks in advance for any help!
