Kafka Input - avro https schema_registry_url: unable to find valid certification path to requested target

Hi friends,

Logstash version is 7.11.1

I am having trouble consuming messages from a Kafka cluster that uses Confluent Avro. I can't seem to connect to an HTTPS schema registry.

Here is my config:

input {
  kafka {
    bootstrap_servers => "mybroker.kafka:9094"
    group_id => "my_group"
    client_id => "my_client"
    topics => "mytopic"
    schema_registry_url => "https://my.registry:8081/"
    schema_registry_key => "myuser"
    schema_registry_secret => "mypassword"
    codec => "json"
    decorate_events => true
    auto_offset_reset => "earliest"
    jaas_path => "/path/to/kafka-jaas-input.conf"
    sasl_mechanism => "SCRAM-SHA-512"
    security_protocol => "SASL_SSL"
    ssl_truststore_location => "/path/to/certs/my.truststore.jks"
    ssl_truststore_password => "REDACTED"
    key_deserializer_class => "org.apache.kafka.common.serialization.ByteArrayDeserializer"
  }
}

I get the following error message:

Pipeline error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: Schema registry service doesn't respond, error: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target>

I have tried with a schema_registry_url that does not use HTTPS and it works. Kafka itself uses SASL_SSL and that secured connection works fine. From what I can tell, both Kafka and the registry use certificates signed by the same CA.
My hypothesis is that the registry's HTTP client does not inherit the truststore configured for the Kafka connection. From my limited understanding of Java, the registry client is a different class from the Kafka consumer, so that would make sense.

Is there any way to either:

  1. inherit the truststore from the Kafka client, or
  2. bypass SSL verification altogether for the schema registry?

I might be completely wrong and it might just be a certificate issue, but I don't know how to verify whether that is the problem from within Logstash.
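One way to test the certificate chain independently of Logstash is a small Java sketch along these lines (the truststore path, password placeholder, and registry URL are the ones from my config above). If the handshake succeeds, even if the response is a 401, the truststore does trust the registry's CA and the problem is on the Logstash side:

import java.io.FileInputStream;
import java.net.URL;
import java.security.KeyStore;
import javax.net.ssl.HttpsURLConnection;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManagerFactory;

public class RegistryTlsCheck {
    public static void main(String[] args) throws Exception {
        // Load the same truststore the kafka input is pointed at
        KeyStore ts = KeyStore.getInstance("JKS");
        try (FileInputStream in = new FileInputStream("/path/to/certs/my.truststore.jks")) {
            ts.load(in, "REDACTED".toCharArray());
        }
        TrustManagerFactory tmf =
                TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
        tmf.init(ts);
        SSLContext ctx = SSLContext.getInstance("TLS");
        ctx.init(null, tmf.getTrustManagers(), null);

        // A PKIX error here would mean the truststore really is the problem;
        // any HTTP status (200, 401, ...) means the TLS handshake itself worked.
        HttpsURLConnection conn =
                (HttpsURLConnection) new URL("https://my.registry:8081/subjects").openConnection();
        conn.setSSLSocketFactory(ctx.getSocketFactory());
        System.out.println("HTTP status: " + conn.getResponseCode());
    }
}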

Thank you.

I decided to use HAProxy for SSL termination to work around the certificate problem. I then got to the following error message:

org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition -0 at offset 7104. If needed, please seek past the record to continue consumption.
Caused by: org.apache.kafka.common.errors.SerializationException: Error retrieving Avro unknown schema for id 100769
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unauthorized; error code: 401

My investigation led me to this topic on Stack Overflow:
https://stackoverflow.com/questions/67586950/can-not-connect-logstash-to-confluent-schema-registry

It describes exactly my problem. I tried adding the auth header in my HAProxy configuration and it worked!

To anyone trying to consume Avro with a schema registry: as of today, the Logstash kafka input cannot handle an HTTPS registry, whether with a truststore or with basic authentication.

The SO post mentions a forked codec available here. I will try it and see if it fixes my problem.

For reference, here is the HAProxy configuration I used.

frontend registry
    mode http
    bind :80
    default_backend servers

backend servers
    mode http
    http-request set-header Authorization Basic\ <base64_of_user:password>
    server srv1 my.registry:8081 ssl verify none # Don't do this in production
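For reference, the <base64_of_user:password> placeholder is just the standard HTTP Basic token, i.e. the base64 encoding of the literal string user:password. For example, with the placeholder credentials from my input config:

import java.util.Base64;

public class BasicAuthToken {
    public static void main(String[] args) {
        // Base64-encode "user:password" exactly as HTTP Basic auth expects
        String token = Base64.getEncoder()
                .encodeToString("myuser:mypassword".getBytes());
        System.out.println("Authorization: Basic " + token);
    }
}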

I believe this deserves an issue on the kafka input plugin's GitHub.

Edit: here it is.

It is unclear to me whether this issue/PR from last week addresses this.


Hi Badger,

Yes, that seems to be a likely fix for my issue. This is what I was also told on GitHub.

I will update the plugin on my Logstash 7.11.1 and see if it works. If it doesn't, I'll repeat the plugin upgrade on 7.13.3 (the version of logstash-integration-kafka bundled with 7.13.3 seems too old to include that fix).

Thank you.

From what I can tell, we're halfway there.
Basic auth now works with logstash-integration-kafka v10.8.1 (the latest as of today), but the registry connection still fails when using HTTPS.

I no longer need to inject the basic auth header in HAProxy, but I still need HAProxy to terminate TLS and forward the traffic over plain HTTP.

New issue on GitHub.

I can't edit my previous post anymore, so for those of you trying to use HTTPS with an Avro schema_registry_url: it is most likely not working as of v10.8.1 of the logstash-integration-kafka plugin.

I am trying to get this reviewed and tested by someone else so we can confirm 100% that this is indeed a bug in the plugin.

If you can help by setting up an Avro schema registry over HTTPS and confirming you have the same problem, it would be greatly appreciated.
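If it helps anyone reproduce this outside of Logstash, here is a rough sketch using the Confluent client classes the plugin itself wraps. I'm assuming the RestService API of the 5.x Confluent client here; class and method names may differ between versions, so treat this as a sketch rather than a drop-in test. If this succeeds with the same truststore and credentials while the plugin fails, the bug is in the plugin's wiring rather than in the registry:

import io.confluent.kafka.schemaregistry.client.rest.RestService;

import java.io.FileInputStream;
import java.security.KeyStore;
import java.util.HashMap;
import java.util.Map;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManagerFactory;

public class RegistryRepro {
    public static void main(String[] args) throws Exception {
        // Build an SSL context from the same truststore the kafka input uses
        KeyStore ts = KeyStore.getInstance("JKS");
        try (FileInputStream in = new FileInputStream("/path/to/certs/my.truststore.jks")) {
            ts.load(in, "REDACTED".toCharArray());
        }
        TrustManagerFactory tmf =
                TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
        tmf.init(ts);
        SSLContext ctx = SSLContext.getInstance("TLS");
        ctx.init(null, tmf.getTrustManagers(), null);

        RestService rest = new RestService("https://my.registry:8081");
        rest.setSslSocketFactory(ctx.getSocketFactory());

        // Basic auth settings, the ones schema_registry_key/secret should map to
        Map<String, Object> cfg = new HashMap<>();
        cfg.put("basic.auth.credentials.source", "USER_INFO");
        cfg.put("basic.auth.user.info", "myuser:mypassword");
        rest.configure(cfg);

        // Lists registry subjects; throws on TLS or auth failure
        System.out.println(rest.getAllSubjects());
    }
}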

I am marking the previous post as the solution and I invite anyone reading this in the future to take a look at the GitHub issue to see how things have evolved.

Thank you.
