Pipeline error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: Schema registry service doesn't respond, error: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target>
I have tried with a schema_registry_url that does not use HTTPS, and it works. Kafka uses SASL_SSL and the secured broker connection works fine. Both Kafka and the registry use the same CA, from what I can tell.
My hypothesis is that, for some reason, the HTTP connection to the registry does not inherit the truststore configuration. From my very limited understanding of Java, it is not the same class, so that would make sense.
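For context, this is roughly the shape of the input configuration in question (broker, topic, hostnames, paths, and passwords are placeholders; SASL credentials omitted). The ssl_truststore_* options are documented for the broker connection, and there does not seem to be an equivalent that applies to the schema registry client:

input {
  kafka {
    bootstrap_servers => "broker1:9093"
    topics => ["my-topic"]
    security_protocol => "SASL_SSL"
    sasl_mechanism => "PLAIN"
    # Truststore settings for the broker connection; they do not appear to be
    # reused by the HTTP client that talks to the schema registry
    ssl_truststore_location => "/etc/logstash/truststore.jks"
    ssl_truststore_password => "changeit"
    schema_registry_url => "https://my.registry:8081"
  }
}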
Is there any way to either:
inherit the truststore from the Kafka client, or
bypass SSL verification altogether for the schema registry?
I might be completely wrong and it might just be a certificate issue, but I don't know how to verify whether that is the problem or not.
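One way to rule out a plain certificate problem (host, port, truststore path, and password are placeholders) is to compare what the registry presents with what the truststore contains:

# Show the certificate chain the registry actually presents
openssl s_client -connect my.registry:8081 -showcerts </dev/null

# List the CAs in the truststore used for the Kafka connection
keytool -list -v -keystore /etc/logstash/truststore.jks -storepass changeit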
I decided to use HAProxy as an SSL termination point to get around the security problem. I then got to the following error message:
org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition -0 at offset 7104. If needed, please seek past the record to continue consumption.
Caused by: org.apache.kafka.common.errors.SerializationException: Error retrieving Avro unknown schema for id 100769
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unauthorized; error code: 401
Which is exactly what my problem is. I tried adding the auth header in my HAProxy configuration and it worked!
To anyone trying to consume Avro with a schema registry: as of today, Logstash cannot handle a schema registry behind HTTPS with a truststore, or one requiring basic authentication.
The SO post mentions using the forked codec available here. I will try it and see if it fixes my problems.
For reference, here is the HAProxy configuration I used.
frontend registry
    mode http
    bind :80
    default_backend servers

backend servers
    mode http
    http-request set-header Authorization Basic\ <base64_of_user:password>
    server srv1 my.registry:8081 ssl verify none   # Don't do this in production
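With that in place, the kafka input just points the registry at the HAProxy frontend (hostname is a placeholder); HAProxy terminates TLS and injects the Authorization header:

# In the kafka input block
schema_registry_url => "http://my.haproxy:80"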
I believe this deserves raising an issue on the kafka input plugin's GitHub repository.
Yes, that seems to be a likely fix for my issue. This is what I was also told on GitHub.
I will update the plugin on my Logstash 7.11.1 and see if it works. If it doesn't, I'll repeat the plugin upgrade on 7.13.3 (the logstash-integration-kafka version released for 7.13.3 seems too old to incorporate that fix).
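For reference, the upgrade itself should just be the standard plugin update, run from the Logstash home directory:

bin/logstash-plugin update logstash-integration-kafka
bin/logstash-plugin list --verbose logstash-integration-kafka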
From what I can tell, we're halfway there.
Basic auth now works with logstash-integration-kafka v10.8.1 (the latest as of today), but the schema registry connection still fails when using HTTPS.
I no longer need to set the basic auth header directly in HAProxy, but I still need HAProxy to terminate the TLS encryption and forward the traffic over plain HTTP.
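Concretely, something like this now authenticates correctly through the plugin's schema_registry_key / schema_registry_secret options, but only as long as schema_registry_url stays plain HTTP (hostname and credentials are placeholders):

schema_registry_url => "http://my.haproxy:80"
schema_registry_key => "registry_user"
schema_registry_secret => "registry_password"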
I can't edit my previous post anymore, so for those of you trying to use an HTTPS schema_registry_url with Avro: it is most likely not working as of v10.8.1 of the logstash-integration-kafka plugin.
I am trying to get this reviewed and tested by someone else so we can confirm 100% that this is indeed a bug in the plugin.
If you believe you can help by setting up an Avro schema registry using HTTPS and confirming you have the same problem, it would be greatly appreciated.
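If you want to rule out the registry setup itself before blaming the plugin, a quick check from the Logstash host (host, CA path, and credentials are placeholders) is:

curl -u registry_user:registry_password --cacert /path/to/ca.pem https://my.registry:8081/subjects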
I am marking the previous post as the solution for the problem, and I invite anyone in the future to take a look at the GitHub issue to see how things have evolved.