Elastic 6.3 problems with SSL Kibana connection


(Justin Wells) #1

I am having difficulty with Kibana 6.3.0 connecting to Elasticsearch 6.3.0 via SSL.

The setup is a single Elasticsearch node in a dev environment with security enabled. Kibana is able to connect to the Elasticsearch node and retrieve license information.

Kibana SSL config:

elasticsearch.url: "https://<hostname>:9200"

elasticsearch.username: "kibana"
elasticsearch.password: <password>

server.ssl.enabled: true
server.ssl.certificate: /etc/kibana/<filename>.cer
server.ssl.key: /etc/kibana/<filename>.key

elasticsearch.ssl.certificate: /etc/kibana/elastic.cer
elasticsearch.ssl.key: /etc/kibana/elastic.key

The elastic.key and elastic.cer files are PEM-encoded and correspond to the PKCS#12 files used in Elasticsearch as configured below. The PEM files were generated with OpenSSL by converting the PKCS#12 bundle to PEM.
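The conversion was roughly as follows (the first two commands just build a throwaway self-signed key/cert and PKCS#12 bundle so the example is self-contained; in the real setup elastic.p12 already exists, and `changeit` stands in for the actual bundle password):

```shell
# Stand-in for the real bundle: throwaway self-signed pair packed into PKCS#12
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj "/CN=demo" \
    -keyout demo.key -out demo.cer
openssl pkcs12 -export -inkey demo.key -in demo.cer \
    -out elastic.p12 -passout pass:changeit

# The actual conversion: extract the PEM certificate (no keys) ...
openssl pkcs12 -in elastic.p12 -clcerts -nokeys \
    -out elastic.cer -passin pass:changeit
# ... and the private key; -nodes writes it out unencrypted
openssl pkcs12 -in elastic.p12 -nocerts -nodes \
    -out elastic.key -passin pass:changeit
```

The unencrypted key matters because Kibana will not be able to read an encrypted key unless a passphrase setting is also configured.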

Elasticsearch SSL config:

xpack.security.enabled: true
xpack.security.transport.ssl.enabled: true
xpack.security.transport.ssl.verification_mode: certificate
xpack.security.transport.ssl.keystore.path: /etc/elasticsearch/certs/elastic.p12
xpack.security.transport.ssl.truststore.path: /etc/elasticsearch/certs/elastic.p12

xpack.security.http.ssl.enabled: true
xpack.security.http.ssl.key: /etc/elasticsearch/certs/<filename>.key
xpack.security.http.ssl.certificate: /etc/elasticsearch/certs/<filename>.cer

xpack:
  security:
    authc:
      realms:
        active_directory:
          type: active_directory
          order: 0
          domain_name: <domain.name>
          url: ldaps://<hostname>:636
          ssl:
            certificate_authorities: ["<ca.cer>"]
            verification_mode: certificate
          user_search.base_dn: OU=Users,OU=<OU>,DC=<DC>,DC=com
          group_search.base_dn: OU=Roles,OU=<OU>,OU=Groups,OU=<OU>,DC=<DC>,DC=com
          load_balance:
            type: round_robin

I'll post the logs in a reply to stay within the post length limit.


(Justin Wells) #2

Elasticsearch Log:

[2018-06-29T07:00:02,338][WARN ][o.e.x.s.t.n.SecurityNetty4HttpServerTransport] [i-095847a50f05e7e7b] caught exception while handling client http traffic, closing connection [id: 0x10641c98, L:0.0.0.0/0.0.0.0:9200 ! R:/127.0.0.1:21084]
io.netty.handler.codec.DecoderException: io.netty.handler.ssl.NotSslRecordException: not an SSL/TLS record: 474554202f20485454502f312e310d0a486f73743a203132372e302e302e313a393230300d0a4163636570742d456e636f64696e673a206964656e746974790d0a636f6e6e656374696f6e3a206b6565702d616c6976650d0a636f6e74656e742d747970653a206170706c69636174696f6e2f6a736f6e0d0a0d0a
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:459) ~[netty-codec-4.1.16.Final.jar:4.1.16.Final]
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:265) ~[netty-codec-4.1.16.Final.jar:4.1.16.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:134) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:545) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:499) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459) [netty-transport-4.1.16.Final.jar:4.1.16.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858) [netty-common-4.1.16.Final.jar:4.1.16.Final]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_171]
Caused by: io.netty.handler.ssl.NotSslRecordException: not an SSL/TLS record: 474554202f20485454502f312e310d0a486f73743a203132372e302e302e313a393230300d0a4163636570742d456e636f64696e673a206964656e746974790d0a636f6e6e656374696f6e3a206b6565702d616c6976650d0a636f6e74656e742d747970653a206170706c69636174696f6e2f6a736f6e0d0a0d0a
at io.netty.handler.ssl.SslHandler.decodeJdkCompatible(SslHandler.java:1106) ~[?:?]
at io.netty.handler.ssl.SslHandler.decode(SslHandler.java:1162) ~[?:?]
at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:489) ~[?:?]
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:428) ~[?:?]
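For what it's worth, the hex payload in that NotSslRecordException is just the raw bytes of the rejected request, and decoding it shows a plaintext (non-TLS) HTTP request arriving on the HTTPS port from localhost:

```python
# Hex payload copied from the NotSslRecordException above, split for readability
payload = bytes.fromhex(
    "474554202f20485454502f312e310d0a"
    "486f73743a203132372e302e302e313a393230300d0a"
    "4163636570742d456e636f64696e673a206964656e746974790d0a"
    "636f6e6e656374696f6e3a206b6565702d616c6976650d0a"
    "636f6e74656e742d747970653a206170706c69636174696f6e2f6a736f6e0d0a0d0a"
)
print(payload.decode("ascii"))
# First two lines of the output:
#   GET / HTTP/1.1
#   Host: 127.0.0.1:9200
```

So something on the Elasticsearch host itself is calling http://127.0.0.1:9200 instead of https, which would explain these warnings independently of the Kibana TLS settings.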

(Justin Wells) #3

Kibana log:

Jun 29 17:24:21 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:21Z","tags":["license","info","xpack"],"pid":2747,"message":"Imported license information from Elasticsearch for the [monitoring] cluster: mode: gold | status: active | expiry date: 2019-05-31T23:59:59+00:00"}
Jun 29 17:24:23 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:23Z","tags":["debug","monitoring-ui","kibana-monitoring"],"pid":2747,"message":"Received Kibana Ops event data"}
Jun 29 17:24:23 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:23Z","tags":["plugin","debug"],"pid":2747,"message":"Checking Elasticsearch version"}
Jun 29 17:24:26 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:26Z","tags":["plugin","debug"],"pid":2747,"message":"Checking Elasticsearch version"}
Jun 29 17:24:28 ip-10-224-36-234 kibana[2747]: {"type":"ops","@timestamp":"2018-06-29T17:24:28Z","tags":[],"pid":2747,"os":{"load":[0.2216796875,0.04931640625,0.01611328125],"mem":{"total":16825729024,"free":15852728320},"uptime":68184},"proc":{"uptime":11.61,"mem":{"rss":209600512,"heapTotal":159735808,"heapUsed":127384040,"external":1912790},"delay":0.3979109972715378},"load":{"requests":{},"concurrents":{"5601":0},"responseTimes":{},"sockets":{"http":{"total":0},"https":{"total":0}}},"message":"memory: 121.5MB uptime: 0:00:12 load: [0.22 0.05 0.02] delay: 0.398"}
Jun 29 17:24:28 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:28Z","tags":["debug","monitoring-ui","kibana-monitoring"],"pid":2747,"message":"Received Kibana Ops event data"}
Jun 29 17:24:28 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:28Z","tags":["plugin","debug"],"pid":2747,"message":"Checking Elasticsearch version"}
Jun 29 17:24:31 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:31Z","tags":["plugin","debug"],"pid":2747,"message":"Checking Elasticsearch version"}
Jun 29 17:24:31 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:31Z","tags":["debug","monitoring-ui","kibana-monitoring"],"pid":2747,"message":"Fetching data from kibana collector"}
Jun 29 17:24:31 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:31Z","tags":["debug","monitoring-ui","kibana-monitoring"],"pid":2747,"message":"Fetching data from kibana_stats collector"}
Jun 29 17:24:31 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:31Z","tags":["debug","monitoring-ui","kibana-monitoring"],"pid":2747,"message":"Fetching data from kibana_settings collector"}
Jun 29 17:24:31 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:31Z","tags": ["debug","monitoring-ui","kibana-monitoring"],"pid":2747,"message":"Fetching data from reporting_stats collector"}
Jun 29 17:24:31 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:31Z","tags":["reporting","debug","exportTypes"],"pid":2747,"message":"Found exportType at /usr/share/kibana/node_modules/x-pack/plugins/reporting/export_types/csv/server/index.js"}
Jun 29 17:24:31 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:31Z","tags":["reporting","debug","exportTypes"],"pid":2747,"message":"Found exportType at /usr/share/kibana/node_modules/x-pack/plugins/reporting/export_types/printable_pdf/server/index.js"}
Jun 29 17:24:31 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:31Z","tags":["debug","monitoring-ui","kibana-monitoring"],"pid":2747,"message":"[null] default admin email setting found, sending [kibana_settings] monitoring document."}
Jun 29 17:24:31 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:31Z","tags":["debug","monitoring-ui","kibana-monitoring"],"pid":2747,"message":"Uploading bulk Kibana monitoring payload"}
Jun 29 17:24:33 ip-10-224-36-234 kibana[2747]: {"type":"ops","@timestamp":"2018-06-29T17:24:33Z","tags":[],"pid":2747,"os":{"load":[0.2841796875,0.0654296875,0.021484375],"mem":{"total":16825729024,"free":15852568576},"uptime":68189},"proc":{"uptime":16.611,"mem":{"rss":209600512,"heapTotal":159735808,"heapUsed":129320400,"external":2069423},"delay":0.3521210104227066},"load":{"requests":{},"concurrents":{"5601":0},"responseTimes":{},"sockets":{"http":{"total":0},"https":{"total":0}}},"message":"memory: 123.3MB uptime: 0:00:17 load: [0.28 0.07 0.02] delay: 0.352"}
Jun 29 17:24:33 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:33Z","tags":["debug","monitoring-ui","kibana-monitoring"],"pid":2747,"message":"Received Kibana Ops event data"}
Jun 29 17:24:33 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:33Z","tags":["plugin","debug"],"pid":2747,"message":"Checking Elasticsearch version"}
Jun 29 17:24:36 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:36Z","tags":["plugin","debug"],"pid":2747,"message":"Checking Elasticsearch version"}
Jun 29 17:24:38 ip-10-224-36-234 kibana[2747]: {"type":"ops","@timestamp":"2018-06-29T17:24:38Z","tags":[],"pid":2747,"os":{"load":[0.26123046875,0.06396484375,0.02099609375],"mem":{"total":16825729024,"free":15852441600},"uptime":68194},"proc":{"uptime":21.613,"mem":{"rss":209600512,"heapTotal":159735808,"heapUsed":130299000,"external":2153445},"delay":0.32137200236320496},"load":{"requests":{},"concurrents":{"5601":0},"responseTimes":{},"sockets":{"http":{"total":0},"https":{"total":0}}},"message":"memory: 124.3MB uptime: 0:00:22 load: [0.26 0.06 0.02] delay: 0.321"}
Jun 29 17:24:38 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:38Z","tags":["debug","monitoring-ui","kibana-monitoring"],"pid":2747,"message":"Received Kibana Ops event data"}
Jun 29 17:24:38 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:38Z","tags":["plugin","debug"],"pid":2747,"message":"Checking Elasticsearch version"}
Jun 29 17:24:41 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:41Z","tags":["debug","monitoring-ui","kibana-monitoring"],"pid":2747,"message":"Fetching data from kibana collector"}
Jun 29 17:24:41 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:41Z","tags":["debug","monitoring-ui","kibana-monitoring"],"pid":2747,"message":"Fetching data from kibana_stats collector"}
Jun 29 17:24:41 ip-10-224-36-234 kibana[2747]: {"type":"log","@timestamp":"2018-06-29T17:24:41Z","tags":["debug","monitoring-ui","kibana-monitoring"],"pid":2747,"message":"not sending [kibana_settings] monitoring document because [undefined] is null or invalid."}

(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.