"No Living Connection" in Kibana 7.1.0 while using Default security provided in 7.1

Hi Team,
I was going through the training "Fundamentals of Securing Elastic".

As part of it, I set up inter-node SSL/TLS communication using:

/usr/share/elasticsearch/bin/elasticsearch-certutil ca
/usr/share/elasticsearch/bin/elasticsearch-certutil cert --ca elastic-stack-ca.p12 

Doing so gave me the error below:
Error: Caused by: java.security.AccessControlException: access denied ("java.io.FilePermission" "/usr/share/elasticsearch/elastic-certificates.p12" "read")
I managed to resolve this after reading that the certificates need to be placed in /etc/elasticsearch (a sketch of that copy step follows the snippet below), and I added the following to elasticsearch.yml:

xpack.security.transport.ssl.enabled: true
xpack.security.transport.ssl.verification_mode: certificate 
xpack.security.transport.ssl.keystore.path: /etc/elasticsearch/elastic-certificates.p12
xpack.security.transport.ssl.truststore.path: /etc/elasticsearch/elastic-certificates.p12
xpack.security.http.ssl.enabled: true
xpack.security.http.ssl.keystore.path: /etc/elasticsearch/elastic-certificates.p12 
xpack.security.http.ssl.truststore.path: /etc/elasticsearch/elastic-certificates.p12
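
The copy/permission step I used looked roughly like this (paths as above; the exact ownership and mode are assumptions and may differ on your system):

cp /usr/share/elasticsearch/elastic-certificates.p12 /etc/elasticsearch/
chown root:elasticsearch /etc/elasticsearch/elastic-certificates.p12
chmod 640 /etc/elasticsearch/elastic-certificates.p12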

Now if I run

curl -XGET https://localhost:9200

I get an error as below:

curl: (60) SSL certificate problem: self signed certificate in certificate chain
More details here: curl - SSL CA Certificates
curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.

But hitting the same URL in a browser works fine, so I assume I can get curl working somehow as well (see below).
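
For reference, I believe curl can either be pointed at a PEM copy of the CA certificate or told to skip verification entirely (insecure); something along these lines, where the cert path is just a placeholder:

curl --cacert /path/to/ca.crt https://localhost:9200
curl -k https://localhost:9200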

++++++++++++

Next I installed Kibana and tried to point it at Elasticsearch over HTTPS. Since Kibana does not support the PKCS#12 keystore, I generated certificates in PEM format from the same CA:

/usr/share/elasticsearch/bin/elasticsearch-certutil cert --ca elastic-stack-ca.p12 --pem
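
Roughly, unpacking and copying the PEM output looked like this (the zip and file names are what certutil produced for me; the kibana group in the chown is an assumption):

unzip certificate-bundle.zip
cp instance/instance.crt instance/instance.key /etc/kibana/
chown root:kibana /etc/kibana/instance.crt /etc/kibana/instance.key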

Then I modified kibana.yml with the certificate and URL:

elasticsearch.hosts: ["https://localhost:9200"]
elasticsearch.ssl.certificateAuthorities: /etc/kibana/instance.crt

Now, on restart, Kibana reports the error "No living connections":

May 22 11:03:34 ip-172-31-28-36 kibana: {"type":"log","@timestamp":"2019-05-22T11:03:34Z","tags":["warning","elasticsearch","admin"],"pid":21957,"message":"Unable to revive connection: https://localhost:9200/"}
May 22 11:03:34 ip-172-31-28-36 kibana: {"type":"log","@timestamp":"2019-05-22T11:03:34Z","tags":["warning","elasticsearch","admin"],"pid":21957,"message":"No living connections"}

Below is my consolidated elasticsearch.yml and kibana.yml
Elasticsearch.yml:

cluster.name: elk
network.host: localhost
http.port: 9200
path.data: /var/lib/elasticsearch
path.logs: /var/log/elasticsearch
xpack.security.transport.ssl.enabled: true
xpack.security.transport.ssl.verification_mode: certificate
xpack.security.transport.ssl.keystore.path: /etc/elasticsearch/elastic-certificates.p12
xpack.security.transport.ssl.truststore.path: /etc/elasticsearch/elastic-certificates.p12
xpack.security.http.ssl.enabled: true
xpack.security.http.ssl.keystore.path: /etc/elasticsearch/elastic-certificates.p12
xpack.security.http.ssl.truststore.path: /etc/elasticsearch/elastic-certificates.p12

Kibana.yml:

server.port: 5601
server.host: "0.0.0.0"
elasticsearch.hosts: ["https://localhost:9200"]
elasticsearch.ssl.certificateAuthorities: /etc/kibana/instance.crt
elasticsearch.ssl.key: /etc/kibana/instance.key

Any idea what else I am missing here? Opening https://localhost:9200 in a browser works fine, but Kibana does not start.

You likely will want to remove elasticsearch.ssl.key from your kibana.yml. You don't need to set this unless you're using PKI to authenticate the Kibana internal server user, and if you'd like to do so you'll have to set elasticsearch.ssl.certificate in addition to elasticsearch.ssl.key.
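
For illustration only, a PKI setup would add something like the following to kibana.yml (the file paths here are placeholders, not something you need for this setup):

elasticsearch.ssl.certificate: /etc/kibana/kibana-client.crt
elasticsearch.ssl.key: /etc/kibana/kibana-client.key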

From the server running Kibana, could you execute the following curl, making the appropriate substitutions, and reply with the results:

curl -u elastic:<insert-password> https://localhost:9200 --capath /etc/kibana/instance.crt

@Brandon_Kobel: I tried both of the following combinations, but I get the error with both of them, as below.

elasticsearch.hosts: ["https://localhost:9200"]
elasticsearch.ssl.certificateAuthorities: /etc/kibana/instance.crt

As well as

elasticsearch.hosts: ["https://localhost:9200"]
elasticsearch.ssl.certificateAuthorities: /etc/kibana/instance.crt
elasticsearch.ssl.key: /etc/kibana/instance.key
elasticsearch.ssl.certificate: /etc/kibana/instance.crt

But in both of the above cases I get the same error.

Also, below is the output of the command you asked for. When I run it with -k it works, whereas without -k it doesn't.

[root@ip-172-31-28-36 root]# curl -u elastic:elastic https://localhost:9200 --capath /etc/kibana/instance.crt
curl: (60) SSL certificate problem: self signed certificate in certificate chain
More details here: curl - SSL CA Certificates

curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it. To learn more about this situation and
how to fix it, please visit the web page mentioned above.

And with -k

[root@ip-172-31-28-36 root]# curl -k -u elastic:elastic https://localhost:9200 --capath /etc/kibana/instance.crt
{
  "name" : "ip-172-31-28-36.us-east-2.compute.internal",
  "cluster_name" : "elk",
  "cluster_uuid" : "pdeSeW_YS-2_qq7ks2eAfQ",
  "version" : {
    "number" : "7.1.0",
    "build_flavor" : "default",
    "build_type" : "rpm",
    "build_hash" : "606a173",
    "build_date" : "2019-05-16T00:43:15.323135Z",
    "build_snapshot" : false,
    "lucene_version" : "8.0.0",
    "minimum_wire_compatibility_version" : "6.8.0",
    "minimum_index_compatibility_version" : "6.0.0-beta1"
  },
  "tagline" : "You Know, for Search"
}

Let me know if you need anything further to analyze this.

Thanks!

Using -k ignores certificate errors, which negates what we're trying to test here. We want to ensure that the certificate that we're specifying in elasticsearch.ssl.certificateAuthorities can validate the certificates which Elasticsearch is using.

You'll need to extract the CA's certificate from the elastic-stack-ca.p12 by using the following command, and specify a path to this file in your elasticsearch.ssl.certificateAuthorities:

openssl pkcs12 -in elastic-stack-ca.p12 -out ca.crt -cacerts -nokeys

It works for me; you have to change the ownership. Here is what I did:

https://discuss.elastic.co/t/problem-on-elasticsearch-document/182093/2

cd /usr/share/elasticsearch/
bin/elasticsearch-certutil cert -out /etc/elasticsearch/config/elastic-certificates.p12 -pass ""
chown -R elasticsearch:elasticsearch /etc/elasticsearch/config



Add the following lines to /etc/elasticsearch/elasticsearch.yml:

xpack.security.enabled: true
xpack.security.transport.ssl.enabled: true
xpack.security.transport.ssl.verification_mode: certificate
xpack.security.transport.ssl.keystore.path: /etc/elasticsearch/config/elastic-certificates.p12
xpack.security.transport.ssl.truststore.path: /etc/elasticsearch/config/elastic-certificates.p12


Now the Kibana setup. In /etc/kibana/kibana.yml, add the following lines:

elasticsearch.username: "kibana"
elasticsearch.password: "kibana"
xpack.security.enabled: true
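
If the built-in users do not have passwords set yet, you can set them first with something like this (assuming the default RPM layout):

/usr/share/elasticsearch/bin/elasticsearch-setup-passwords interactive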

@elasticforme it's working fine for TLS, but not for HTTPS.

@Brandon_Kobel when running the command below, I get a ca.crt file of size 0:

[root@ip-172-31-28-36 elasticsearch]# openssl pkcs12 -in elastic-stack-ca.p12 -out ca.crt -cacerts -nokeys
Enter Import Password:
MAC verified OK
[root@ip-172-31-28-36 elasticsearch]# ls -lrt
total 500
-rw-r--r-- 1 root root 8478 May 16 00:40 README.textile
-rw-r--r-- 1 root root 13675 May 16 00:40 LICENSE.txt
-rw-rw-r-- 1 root root 447478 May 16 00:45 NOTICE.txt
drwxr-xr-x 2 root root 6 May 16 00:54 plugins
drwxr-xr-x 2 root root 4096 May 22 09:29 bin
drwxr-xr-x 8 root root 96 May 22 09:29 jdk
drwxr-xr-x 3 root root 4096 May 22 09:29 lib
drwxr-xr-x 29 root root 4096 May 22 09:29 modules
-rw------- 1 root root 2527 May 22 09:31 elastic-stack-ca.p12
-rwxrwxrwx 1 root root 3451 May 22 09:33 elastic-certificates.p12
-rw-r--r-- 1 root root 524 May 22 10:04 index.html
-rw------- 1 root root 2550 May 22 10:12 certificate-bundle.zip
drwxr-xr-x 2 root root 46 May 22 11:41 instance
-rw-r--r-- 1 root root 139 May 22 11:47 keys_out.txt
-rw-r--r-- 1 root root 0 May 23 11:32 ca.crt
[root@ip-172-31-28-36 elasticsearch]#

Any idea why I am getting a zero-size file?

The inclusion of the -cacerts flag is causing this. Per https://bugzilla.redhat.com/show_bug.cgi?id=1246371#c1, OpenSSL decides whether a certificate is a CA certificate based on whether its matching private key is absent from the PKCS#12 file, which breaks our usage here because the CA file includes its key.

You can use the following instead:

openssl pkcs12 -in elastic-stack-ca.p12 -out ca.crt -nokeys
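
To double-check that the extracted file actually contains a certificate, something along these lines should print its subject and issuer:

openssl x509 -in ca.crt -noout -subject -issuer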

@Brandon_Kobel No luck so far.

I ran the command below to generate ca.crt and then copied it to /etc/kibana:

openssl pkcs12 -in elastic-stack-ca.p12 -out ca.crt -nokeys

After that, I modified kibana.yml to include the lines below:

elasticsearch.hosts: ["https://localhost:9200"]
elasticsearch.ssl.certificateAuthorities: /etc/kibana/ca.crt

But still getting the same error. :frowning:

If you run the following from the server running Kibana, what does curl return?

curl --cacert /etc/kibana/ca.crt https://localhost:9200

If that curl command returns an error about the host name not matching the certificate, you'll want to re-run /usr/share/elasticsearch/bin/elasticsearch-certutil cert --ca elastic-stack-ca.p12 and include the --dns argument:

/usr/share/elasticsearch/bin/elasticsearch-certutil cert --ca elastic-stack-ca.p12 --dns localhost

And then copy the resultant elastic-certificates.p12 file to /etc/elasticsearch/elastic-certificates.p12 and restart Elasticsearch.
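
If you want to confirm which names the currently served certificate covers, one way (assuming Elasticsearch is reachable on localhost:9200) is:

echo | openssl s_client -connect localhost:9200 2>/dev/null | openssl x509 -noout -text | grep -A1 'Subject Alternative Name'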

Thanks @Brandon_Kobel, I am now able to run a one-box setup of Elasticsearch and Kibana with SSL. But as a next step, when I tried running Logstash, it failed.

As per the docs : Elasticsearch input plugin | Logstash Reference [8.11] | Elastic
Based on this, I configured the Elasticsearch input as below:

input {
  elasticsearch {
    hosts => "https://localhost:9200"
    index => "twitter1,twitter2"
    ssl => "true"
    ca_file => "/etc/elasticsearch/ca.pem"
  }
}

I generated the PEM file with: openssl pkcs12 -clcerts -nokeys -out ca.pem -in elastic-certificates.p12

I am getting this error:

[2019-05-28T10:38:33,091][ERROR][logstash.javapipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <LogStash::Inputs::Elasticsearch index=>"twitter1,twitter2", id=>"816c0f6ae2b73b03126c5541f18324f7b3ad7a6650a4861c5ede9d3c83c8649f", ca_file=>"/etc/elasticsearch/ca.pem", ssl=>true, hosts=>["https://localhost:9200"], enable_metric=>true, codec=><LogStash::Codecs::JSON id=>"json_cce6c1fb-471b-4702-a3db-b6727de290df", enable_metric=>true, charset=>"UTF-8">, query=>"{ "sort": [ "_doc" ] }", size=>1000, scroll=>"1m", docinfo=>false, docinfo_target=>"@metadata", docinfo_fields=>["_index", "_type", "_id"]>
Error: Failed to open TCP connection to https:0 (initialize: name or service not known)
Exception: Faraday::ConnectionFailed
Stack: org/jruby/ext/socket/RubyTCPSocket.java:138:in initialize' org/jruby/RubyIO.java:1155:in open'

Also, apart from the Elasticsearch input, I tried the Elasticsearch output as well, with the config below:

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    index => "twitter3"
    document_type => "doc"
    document_id => "%{c_num}"
    doc_as_upsert => "true"
    action => "update"
    ssl => "true"
    cacert => "/etc/elasticsearch/ca.pem"
  }
}

But this does not work either.

Any idea whether anything else is needed to set up Logstash with Elasticsearch? In the training docs I also see this:

For Logstash, you need to update the Elasticsearch output. Similarly to Kibana, Logstash does not support the PKCS#12 keystore yet, so you need to add the path to a PEM file which contains the certificate of your CA:

output {
  elasticsearch {
    ...
    ssl => true
    cacert => '/path/to/cert.pem'
  }
}

But this does not seem to work. Any idea?

You'll want to use the ca.crt extracted from elastic-stack-ca.p12, not from elastic-certificates.p12. It's the same ca.crt that you configured with Kibana.
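
For the Elasticsearch output, that would look something like this (the path assumes the same ca.crt you copied for Kibana):

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    ssl => true
    cacert => "/etc/kibana/ca.crt"
  }
}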

Hello @Brandon_Kobel, I used the ca.crt generated by:
openssl pkcs12 -in elastic-stack-ca.p12 -out ca.crt -nokeys

As well as the instance.crt generated by:
/usr/share/elasticsearch/bin/elasticsearch-certutil cert --ca elastic-stack-ca.p12 --pem

But both give the same error.

Logstash conf snippet:
[root@ip-172-31-28-36 pagrawal]# cat Aladdin_ls.conf

input {
  elasticsearch {
    hosts => "https://localhost:9200"
    index => "twitter1,twitter2"
    ssl => "true"
    ca_file => "/etc/elasticsearch/ca.crt"
  }
}

Error:

[2019-05-28T16:44:09,344][ERROR][logstash.javapipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <LogStash::Inputs::Elasticsearch index=>"twitter1,twitter2", id=>"b95e5e59170b162a01d02cc7cd65c379bcb33a7623ba487ea7760e3ce7b7de1d", ca_file=>"/etc/elasticsearch/ca.crt", ssl=>true, hosts=>["https://localhost:9200"], enable_metric=>true, codec=><LogStash::Codecs::JSON id=>"json_979ce11c-6f98-4a53-acbd-547cd2f738cd", enable_metric=>true, charset=>"UTF-8">, query=>"{ "sort": [ "_doc" ] }", size=>1000, scroll=>"1m", docinfo=>false, docinfo_target=>"@metadata", docinfo_fields=>["_index", "_type", "_id"]>
Error: Failed to open TCP connection to https:0 (initialize: name or service not known)
Exception: Faraday::ConnectionFailed

Whereas curl works fine with the same ca.crt:

[root@ip-172-31-28-36 pagrawal]# curl --cacert /etc/elasticsearch/ca.crt -XGET https://localhost:9200/_cat/indices?v
health status index uuid pri rep docs.count docs.deleted store.size pri.store.size
green open .kibana_task_manager TmJ9H1__Tk-VbO7cMbCQ_Q 1 0 2 2 9.3kb 9.3kb

If the certificates themselves are working, I'd recommend opening a new issue in the Logstash topic with your Logstash specific error because they'll be able to help you debug more competently than I.

Thanks a lot @Brandon_Kobel for the help on Elasticsearch and Kibana. I will follow up on the Logstash part in the Logstash topic.

An update: the Logstash elasticsearch output works with the same ca.crt file we used for Kibana, but the elasticsearch input still does not.

Logstash working conf:

input {
  stdin {}
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["https://172.31.28.36:9200"]
    cacert => "/etc/kibana/ca.crt"
  }
}
