Error: unable to find valid certification path


(Roger) #1

Hi,
I'm trying to configure TLS between Logstash and Elasticsearch.
Currently I have configured TLS on my cluster (3 nodes), correctly it seems, and I can access it using Kibana.
In my Logstash (installed locally on each server) I'm using the elasticsearch output plugin, but when I start it I only get this error:

[2018-03-02T18:59:24,125][WARN ][logstash.licensechecker.licensereader] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"https://elastic:xxxxxx@localhost:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [https://elastic:xxxxxx@localhost:9200/][Manticore::ClientProtocolException] PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target"}
[2018-03-02T18:59:25,400][INFO ][logstash.pipeline        ] Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0xae84822@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246

while in the Elasticsearch logs I get:

[2018-03-02T18:59:24,124][WARN ][o.e.x.s.t.n.SecurityNetty4HttpServerTransport] [elk1] http client did not trust this server's certificate, closing connection [id: 0x3612b68c, L:0.0.0.0/0.0.0.0:9200 ! R:/127.0.0.1:47433]

Here is the relevant piece of my logstash.conf:

elasticsearch {
    hosts => ["https://localhost:9200/"]
    user => "elastic"
    password => "changeme"
    index => "roger-%{+YYYY.MM.dd}"
    template_name => "roger-*"
    template => "/etc/logstash/templates/roger-template-es5x.json"
    cacert => "/etc/logstash/certs/caC.cer"
    ssl => true
    ssl_certificate_verification => false
}

Can you help me?
I know that ssl should not be mandatory since I specify https in the hosts key, and I also know that ssl_certificate_verification => false should not be used.
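In other words, what I would expect to work is the same block with https in hosts and the cacert option, but without ssl_certificate_verification => false; a minimal sketch of that (assuming the CA in caC.cer signed the node certificate and the certificate matches the localhost hostname):

elasticsearch {
    hosts => ["https://localhost:9200/"]
    user => "elastic"
    password => "changeme"
    ssl => true
    cacert => "/etc/logstash/certs/caC.cer"
    index => "roger-%{+YYYY.MM.dd}"
}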

Thanks in advance


(Roger) #2

Just an update:
I have fixed it by adding the CA to the Java keystore.
Here are the commands I used:

echo -n | openssl s_client -connect localhost:9200 | sed -ne '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/p' > ./ca_logstash.cer
keytool -import -alias saelk -file ca_logstash.cer -keystore /usr/lib/jvm/jre-1.8.0-openjdk-1.8.0.65-3.b17.el7.x86_64/lib/security/cacerts
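
To double-check the import, you can list the entry afterwards (keytool prompts for the keystore password, which for the bundled cacerts file is usually changeit; adjust the JVM path to your installation):

keytool -list -alias saelk -keystore /usr/lib/jvm/jre-1.8.0-openjdk-1.8.0.65-3.b17.el7.x86_64/lib/security/cacerts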

(Sa Jain) #3

Hi Roger,
I am facing a similar problem, can you help please?

I have used Search Guard and generated the demo certificates.
When I use the root-ca.pem that Search Guard generated in Logstash, it does not work :frowning:

My Logstash and Elasticsearch are on different servers.

output {
    elasticsearch {
        hosts => "10.16.11.172:9200"
        user => "admin"
        password => "admin"
        ssl => true
        ssl_certificate_verification => false
        cacert => "/usr/share/logstash/root-ca.pem"
        template_name => "ocean-logs-*"
        index => "%{index_name}-%{+YYYY.MM.dd}"
    }
}


(Roger) #4

I'm not sure we can talk about Search Guard here; anyway, I have used it on another cluster.
Taking a look at your output section, you should have

hosts => ["https://10.16.11.172:9200"]
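
So, putting that together with your settings, the output section would look roughly like this (a sketch based on what you posted; ideally drop ssl_certificate_verification => false once the root CA is trusted):

output {
    elasticsearch {
        hosts => ["https://10.16.11.172:9200"]
        user => "admin"
        password => "admin"
        ssl => true
        cacert => "/usr/share/logstash/root-ca.pem"
        template_name => "ocean-logs-*"
        index => "%{index_name}-%{+YYYY.MM.dd}"
    }
}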


(system) #5

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.

