TLS issue connecting to secure Elasticsearch backend

Hi,

I am running the Elastic Stack in Kubernetes and have now switched my Elasticsearch to use the security module, meaning TLS encryption on the transport and HTTP REST APIs plus user authorization.
Currently I am encountering issues when connecting Logstash to the secured REST API:

[2019-08-02T08:08:38,574][ERROR][logstash.licensechecker.licensereader] Unable to retrieve license information from license server {:message=>"No Available connections"}
[2019-08-02T08:08:42,905][WARN ][logstash.licensechecker.licensereader] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"https://logstash_ingest:xxxxxx@test-es-http:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [https://logstash_ingest:xxxxxx@test-es-http:9200/][Manticore::ClientProtocolException] PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target"}

This is the interesting part:
PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target

If I log in to the container and test via curl, it looks fine:

curl -v --cacert $ES_CA_CERT_PATH https://logstash_ingest:xxxxxxx@$ES_HOST:$ES_PORT/?pretty
* About to connect() to test-es-http port 9200 (#0)
*   Trying 10.109.4.217...
* Connected to test-es-http (10.109.4.217) port 9200 (#0)
* Initializing NSS with certpath: sql:/etc/pki/nssdb
*   CAfile: /usr/share/logstash/config/certs/ca.crt
 CApath: none
* NSS: client certificate not found (nickname not specified)
* SSL connection using TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384
* Server certificate:
*       subject: CN=test-es-http
*       start date: Aug 02 07:11:51 2019 GMT
*       expire date: Aug 01 07:11:51 2022 GMT
*       common name: test-es-http
*       issuer: CN=Elastic Certificate Tool Autogenerated CA
* Server auth using Basic with user 'logstash_ingest'
> GET /?pretty HTTP/1.1
> Authorization: Basic bG9nc3Rhc2hfaW5nZXN0OmpqZGhzYWtoVUNHaWdkN2lnX3MoKVRVR1NKVkhqaGtnamdoaHNkZ2soc2doZ2pKR0pTR2RoaGdq
> User-Agent: curl/7.29.0
> Host: test-es-http:9200
> Accept: */*
>
< HTTP/1.1 200 OK
< content-type: application/json; charset=UTF-8
< content-length: 502
<
{
 "name" : "test-es-master-2",
 "cluster_name" : "test",
 "cluster_uuid" : "O5YCLDvxRquC9JPX9tWU2A",
 "version" : {
   "number" : "7.2.0",
   "build_flavor" : "default",
   "build_type" : "docker",
   "build_hash" : "508c38a",
   "build_date" : "2019-06-20T15:54:18.811730Z",
   "build_snapshot" : false,
   "lucene_version" : "8.0.0",
   "minimum_wire_compatibility_version" : "6.8.0",
   "minimum_index_compatibility_version" : "6.0.0-beta1"
 },
 "tagline" : "You Know, for Search"
}
* Connection #0 to host test-es-http left intact

For curl I used exactly the environment variables that are passed to the container and that the deployment manifest configures for the connection.

In Kubernetes I set the relevant environment variables as follows:

   env:
   # elasticsearch connection used for output
      - name: ES_HOST
        value: "test-es-http"
      - name: ES_PORT
        value: "9200"
      - name: USE_ES_SSL
        value: "true"
      - name: ES_CA_CERT_PATH
        value: "/usr/share/logstash/config/certs/ca.crt"

The output configuration of my pipeline looks like this:

output
{
	elasticsearch
	{
		hosts    => ["${ES_HOST}:${ES_PORT}"]
		ssl      => "${USE_ES_SSL}"
		cacert   => "${ES_CA_CERT_PATH}"

		# credentials are fetched from environment or logstash-keystore
		user     => "${LOGSTASH_USER}"
		password => "${LOGSTASH_PASSWORD}"

		# sprintf reference to the index name prepared in the event metadata
		index    => "%{[@metadata][indexName]}"
	}
}

As I understand the error message in Logstash's log, either the cacert file is not found or it was not used to sign the server certificate. With curl, however, it works.
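
One way to double-check from inside the container whether the CA file actually validates what the endpoint presents would be something like the following (openssl in the container and the same environment variables as above are assumed):

# Verify the server's certificate chain against the CA file; look for
# "Verify return code: 0 (ok)" in the output.
echo | openssl s_client -connect "$ES_HOST:$ES_PORT" -servername "$ES_HOST" \
  -CAfile "$ES_CA_CERT_PATH" -verify_return_error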

Could you please point out where my configuration is wrong?

Thanks a lot, Andreas

I found an issue where the CN was not matching. I have corrected it now, but the error remains. Kibana runs fine when the cacert is declared there.

I know from other Java applications that I need to add the self-signed CA to the JVM's cacerts truststore.

Is this also needed here, or is it sufficient to declare the cacert parameter and point it to the CA's certificate?
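
In case it is needed, this is roughly what that import would look like; the truststore path and the default "changeit" password are assumptions on my side and depend on which JVM the Logstash image ships (on JDK 8 the path is $JAVA_HOME/jre/lib/security/cacerts):

# Rough sketch only: import the self-signed CA into the truststore of the JVM
# that Logstash runs on, then restart Logstash.
keytool -importcert -trustcacerts -noprompt \
  -alias elastic-stack-ca \
  -file /usr/share/logstash/config/certs/ca.crt \
  -keystore "$JAVA_HOME/lib/security/cacerts" \
  -storepass changeit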

The documentation for the cacert option says it wants a .cer or .pem file. Are you sure your file is in the right format?

The files are PEM files, created with elasticsearch-certutil and the --pem option.
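
A quick check like the one below should confirm that the file really is a PEM-encoded X.509 certificate (path as in the manifest above):

# Print subject, issuer and validity dates; this fails if the file is not PEM.
openssl x509 -in /usr/share/logstash/config/certs/ca.crt -noout -subject -issuer -dates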

I made a lot of changes and somehow got rid of the error now. As soon as I have messages in Redis, I can say whether it definitively works.
