Problem with Logstash SSL

Hi, I'm having difficulties changing my Logstash SSL certificate. Today I have a certificate in the /etc/logstash folder; I generated a new one with the Elastic tool, elasticsearch-certutil, and then produced the .crt from it with openssl. The new certificate is already correctly named everywhere and has the proper permissions, the same as the old certificate that works.
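For reference, the .p12 → PEM conversion step might look something like the sketch below (the file names and password here are illustrative placeholders, not the poster's actual files; the demo builds its own throwaway .p12 so the commands are self-contained):

```shell
# Demo setup: a self-signed CA key + cert, bundled into a PKCS#12 file,
# standing in for the bundle elasticsearch-certutil produces.
openssl req -x509 -newkey rsa:2048 -keyout demo-ca.key -out demo-ca.crt \
  -days 365 -nodes -subj "/CN=demo-ca"
openssl pkcs12 -export -in demo-ca.crt -inkey demo-ca.key \
  -out demo-bundle.p12 -passout pass:changeme

# The conversion itself: extract the certificate(s) from the .p12 as PEM,
# which is the format logstash's `cacert` option expects.
openssl pkcs12 -in demo-bundle.p12 -passin pass:changeme -nokeys \
  -out extracted-ca.crt
```

In practice the input would be the .p12 written by elasticsearch-certutil, and the extracted PEM is what gets copied to /etc/logstash.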

However, when I switch to the new certificate, the error below appears in the Logstash log:

[2022-02-17T15:15:43,619][WARN ][logstash.outputs.Elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"https://elastic:xxxxxx@localhost:9200/", :exception=>LogStash::Outputs::Elasticsearch::HttpClient::Pool::HostUnreachableError, :message=>"Elasticsearch Unreachable: [https://elastic:xxxxxx@localhost:9200/][Manticore::ClientProtocolException] PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target"}

My configuration file is as follows (only the certificate snippet is pasted here):

output {
        if [type] == "host-1" {
                elasticsearch {
                        hosts => ["https://localhost:9200"]
                        #index => "cacheaudit-cloud1-%{+YYYY.MM.dd}"
                        user => "user"
                        password => "pass"
                        ssl => true
                        ssl_certificate_verification => false
                        cacert => "/etc/logstash/ca.crt"
                        ilm_rollover_alias => "cacheaudit-host1"
                        ilm_policy => "cache-lifecycle"
                }
        }

        if [type] == "host-2" {
                elasticsearch {
                        hosts => ["https://localhost:9200"]
                        #index => "cacheaudit-cloud2-%{+YYYY.MM.dd}"
                        user => "user"
                        password => "pass"
                        ssl => true
                        ssl_certificate_verification => false
                        cacert => "/etc/logstash/ca.crt"
                        ilm_rollover_alias => "cacheaudit-host2"
                        ilm_policy => "cache-lifecycle"
                }
        }
}

The new SSL certificate is valid until 2025.

Can someone help me with this issue?

@gustavoluza Did you make sure that the new certs are readable by logstash from a permission perspective?

@stephenb Yes, both the old and new certificates belong to the root user and root group on Linux, with full access permissions on the file and directory.

I also tried changing the permissions on the new certificate, but I get the same issue in the Logstash log.

I would try a simple curl with the certificate to the Elasticsearch host and see if that works with the same connection information and cert.

That's just `curl https://instanceIP:9200`?

See https://www.baeldung.com/linux/curl-https-connection
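Concretely, something like this (a sketch using the CA path from this thread's config and a placeholder password; substitute your own values):

```shell
# Hit the Elasticsearch endpoint, telling curl to trust the same CA file
# that logstash uses. A PKIX/chain problem shows up here as an SSL error;
# a JSON body (even a 401) means the TLS layer is fine.
curl --cacert /etc/logstash/ca.crt -u elastic:yourpassword \
  https://localhost:9200 || true  # non-zero exit = connection/TLS problem
```

If this returns JSON, the certificate chain is good and the problem lies elsewhere.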

@stephenb curl with the new certificate:

{"error":{"root_cause":[{"type":"security_exception","reason":"missing authentication credentials for REST request [/]","header":{"WWW-Authenticate":["Basic realm=\"security\" charset=\"UTF-8\"","Bearer realm=\"security\"","ApiKey"]}}],"type":"security_exception","reason":"missing authentication credentials for REST request [/]","header":{"WWW-Authenticate":["Basic realm=\"security\" charset=\"UTF-8\"","Bearer realm=\"security\"","ApiKey"]}},"status":401}

This error is expected because of my configuration, and I get the same result with the old certificate.

Can someone help?

If you use curl with the -u username:password option does it connect and return a result?

Have you tried putting the IP in instead of localhost in the Elasticsearch output?

Are there any other pertinent logs in the logstash logs?

I see you are using openssl to generate the CA. Why didn't you just use elasticsearch-certutil in ca mode? (I am not clear on what you did.)

ca

Specifies to generate a new local certificate authority (CA). This parameter cannot be used with the csr or cert parameters.

@Badger any thoughts?

@stephenb How can I generate a .zip file with the p12, ca, and crt inside? I'm trying to use the --multiple option, but I get some errors. Can you give me a link with the syntax? (I'm also searching the web for this, but without success so far.)

EDIT: I generated the cert with the --multiple option, but the zip file only contained the .p12.

What version of the stack are you using?

You will need to show the commands, in the order you used them; otherwise I / we cannot help....

For the Elasticsearch https endpoint you should be following these instructions

The CA you use for Logstash should be the same one you would use for Kibana.

Encrypt traffic between Kibana and Elasticsearch

When you ran the elasticsearch-certutil tool with the http option, it created a /kibana directory containing an elasticsearch-ca.pem

My stack version is 7.15.1

Here are the commands that I used to generate the .zip file with the .p12 cert inside:

./bin/elasticsearch-certutil cert --multiple elastic-stack-ca.p12

For Kibana I use a wildcard CA that was not generated with certutil, but that cert doesn't work with Logstash.

Right....

You need to do the second step, following the instructions I just gave you... the certs you created were for internode TLS, not the HTTP endpoint. You need to create the CA / certs for the HTTP endpoint.

Why don't you try running with the http instruction per the instructions I just gave, and then use that with Logstash (instead of Kibana)?

./bin/elasticsearch-certutil http

Ok...

Now I have these files:

[screenshot of the generated files]

But Logstash uses a .crt file; how can I generate that file?

In the kibana directory that was created, I have this:

[screenshot]

Please don't paste images

Looks like you did a CSR request, which is not correct:

  1. When asked if you want to generate a CSR, enter n.

The Elasticsearch output for Logstash is looking for a CA file, not a cert.

BTW I wrote a How To here that shows all this

@stephenb Sorry for posting the images.

I did the procedure you told me, but I still get an error from Logstash:

[WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"https://elastic:xxxxxx@localhost:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :message=>"Elasticsearch Unreachable: [https://elastic:xxxxxx@localhost:9200/][Manticore::ClientProtocolException] PKIX path validation failed: java.security.cert.CertPathValidatorException: Path does not chain with any of the trust anchors"}

Here is my Logstash file with my hosts... (I use JDBC on input):

if [type] == "host-1" {
        elasticsearch {
                hosts => ["https://localhost:9200"]
                user => "user"
                password => "pass"
                cacert => "/etc/logstash/elasticsearch-ca.pem"
                ilm_rollover_alias => "cacheaudit-cloud1"
                ilm_policy => "cache-lifecycle"
        }
}
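"Path does not chain with any of the trust anchors" can be checked outside of Logstash with `openssl verify`. A self-contained sketch (the demo files here are made up; it builds its own CA, a cert signed by it, and an unrelated cert to reproduce both outcomes):

```shell
# Demo CA
openssl req -x509 -newkey rsa:2048 -keyout ca.key -out ca.pem \
  -days 365 -nodes -subj "/CN=demo-ca"

# A server cert signed by that CA
openssl req -newkey rsa:2048 -keyout server.key -out server.csr \
  -nodes -subj "/CN=localhost"
openssl x509 -req -in server.csr -CA ca.pem -CAkey ca.key \
  -CAcreateserial -out server.pem -days 365

# An unrelated self-signed cert (stands in for the "wrong CA" case)
openssl req -x509 -newkey rsa:2048 -keyout other.key -out other.pem \
  -days 365 -nodes -subj "/CN=other-ca"

openssl verify -CAfile ca.pem server.pem          # -> server.pem: OK
openssl verify -CAfile ca.pem other.pem || true   # fails: does not chain
```

Against a live deployment, the same check works on the real files: fetch the server certificate with `openssl s_client -connect <host>:9200` and verify it against the PEM configured in `cacert`.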

Did you use IPs or localhost when you created the cert? Use the same in the setting.

hosts => ["https://localhost:9200"]

Note my example

    hosts => ["https://10.168.0.116:9200"]

This may be an issue with your SSL setup on the box...

There was also a thread on this here
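One concrete thing to check for hostname-related PKIX failures: the value used in `hosts` (localhost vs. an IP) must appear in the certificate's Subject Alternative Names. A self-contained sketch (the demo cert and its SAN values are made up):

```shell
# Demo cert with explicit SANs, standing in for the Elasticsearch http cert.
openssl req -x509 -newkey rsa:2048 -keyout es.key -out es.crt -days 365 \
  -nodes -subj "/CN=es-node" \
  -addext "subjectAltName=DNS:localhost,IP:10.168.0.116"

# List the SANs; the name/IP in the logstash `hosts` setting must be here.
openssl x509 -in es.crt -noout -ext subjectAltName
```

For a real certificate, point `-in` at the deployed .crt / .pem instead of the demo file (requires OpenSSL 1.1.1 or later for `-addext` / `-ext`).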

@stephenb I followed all the steps in the article (I deployed a new environment), but I'm getting this error:

[2022-03-03T11:29:46,552][ERROR][logstash.licensechecker.licensereader] Unable to retrieve license information from license server {:message=>"Elasticsearch Unreachable: [https://10.0.0.114:9200/_xpack][Manticore::ClientProtocolException] PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target"}
[2022-03-03T11:30:16,457][ERROR][logstash.licensechecker.licensereader] Unable to retrieve license information from license server {:message=>"No Available connections"}

When I curl my Elasticsearch, I get the same output as in the article.

My problem is only with Logstash; Elasticsearch and Kibana are working fine.

I used the same certificates for all of them (Elasticsearch, Kibana, and Logstash).

In my browser I can access https://10.0.0.114:9200/_xpack, and after entering the user and password to log in, I see this:

{"build":{"hash":"e5acb99f822233d62d6444ce45a4543dc1c8059a","date":"2022-02-23T22:20:54.153567231Z"},"license":{"uid":"8d62208b-4ef5-4cc4-ab42-16a6de4639a5","type":"basic","mode":"basic","status":"active"},"features":{"aggregate_metric":{"available":true,"enabled":true},"analytics":{"available":true,"enabled":true},"ccr":{"available":false,"enabled":true},"data_streams":{"available":true,"enabled":true},"data_tiers":{"available":true,"enabled":true},"enrich":{"available":true,"enabled":true},"eql":{"available":true,"enabled":true},"frozen_indices":{"available":true,"enabled":true},"graph":{"available":false,"enabled":true},"ilm":{"available":true,"enabled":true},"logstash":{"available":false,"enabled":true},"ml":{"available":false,"enabled":true,"native_code_info":{"version":"7.17.1","build_hash":"6d8a28b39bf223"}},"monitoring":{"available":true,"enabled":true},"rollup":{"available":true,"enabled":true},"searchable_snapshots":{"available":false,"enabled":true},"security":{"available":true,"enabled":true},"slm":{"available":true,"enabled":true},"spatial":{"available":true,"enabled":true},"sql":{"available":true,"enabled":true},"transform":{"available":true,"enabled":true},"voting_only":{"available":true,"enabled":true},"watcher":{"available":false,"enabled":true}},"tagline":"You know, for X"}

@stephenb After deploying 3 environments and doing many tests, your article helped me resolve the problem.

Thanks so much!


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.