Hello there,
I set up SSL on all our ELK nodes (two Elasticsearch nodes, with Logstash and Kibana on another node), and it all works fine. Without SSL on the Beat hosts, everything worked fine as well.
Now we have been asked to set up SSL/TLS between the Beat servers and Logstash, but we ran into issues.
Here is the SSL configuration in filebeat.yml:
output.logstash:
  # The Logstash hosts
  hosts: ["logstashhostname:5044"]
  ssl.certificate_authorities: ["/opt/keys/ChainBundle2.crt"]
  ssl.certificate: "/opt/keys/ServerCertificate.crt"
  ssl.key: "/opt/keys/xxxx-key.pem"
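Just in case it is relevant, this is roughly how I would check on the Beat host that the client certificate validates against the CA bundle (only a sketch, using the same paths as in filebeat.yml above):

openssl verify -CAfile /opt/keys/ChainBundle2.crt /opt/keys/ServerCertificate.crt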
Here is the Logstash pipeline configuration file for the Beats input:
input {
  beats {
    port => 5044
    client_inactivity_timeout => 120
    ssl => true
    ssl_certificate_authorities => ["/etc/logstash/keys/ChainBundle2.crt"]
    ssl_certificate => "/etc/logstash/keys/ServerCertificate.crt"
    ssl_key => "/etc/logstash/keys/xxxx-key.pem"
    ssl_verify_mode => "force_peer"
  }
}
# The filter part of this file is commented out to indicate that it
# is optional.
# filter {
#
# }
output {
  elasticsearch {
    user => "logstash_ingest"
    password => "changeme"
    ssl => true
    ssl_certificate_verification => true
    cacert => "/etc/logstash/keys/ChainBundle2.crt"
    action => "index"
    hosts => ["elasticnodehostname"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
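For what it is worth, this is how I would check which encoding the Logstash key is in, just by reading the PEM header (a sketch, same path as in the beats input above):

head -1 /etc/logstash/keys/xxxx-key.pem
# "-----BEGIN PRIVATE KEY-----"     -> unencrypted PKCS#8
# "-----BEGIN RSA PRIVATE KEY-----" -> traditional PKCS#1, not PKCS#8
# "-----BEGIN EC PRIVATE KEY-----"  -> SEC1 EC key, not PKCS#8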
====================
Here are the errors in logstash.log (there is only one Filebeat connecting to Logstash right now, for testing):
[2018-10-16T21:16:16,593][ERROR][logstash.inputs.beats ] Looks like you either have a bad certificate, an invalid key or your private key was not in PKCS8 format.
[2018-10-16T21:16:16,593][WARN ][io.netty.channel.ChannelInitializer] Failed to initialize a channel. Closing: [id: 0x396d22d0, L:/10.100.35.68:5044 - R:/10.100.12.118:41314]
java.lang.IllegalArgumentException: File does not contain valid private key: /etc/logstash/keys/hls-201710-dxc-key.pem
at io.netty.handler.ssl.SslContextBuilder.keyManager(SslContextBuilder.java:267) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
........
Caused by: java.security.spec.InvalidKeySpecException: Neither RSA, DSA nor EC worked
at io.netty.handler.ssl.SslContext.getPrivateKeyFromByteBuffer(SslContext.java:1045) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
at io.netty.handler.ssl.SslContext.toPrivateKey(SslContext.java:1014) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
... 20 more
Caused by: java.security.spec.InvalidKeySpecException: java.security.InvalidKeyException: IOException : algid parse error, not a sequence
at sun.security.ec.ECKeyFactory.engineGeneratePrivate(ECKeyFactory.java:169) ~[sunec.jar:1.8.0_181]
at java.security.KeyFactory.generatePrivate(KeyFactory.java:372) ~[?:1.8.0_181]
at io.netty.handler.ssl.SslContext.getPrivateKeyFromByteBuffer(SslContext.java:1043) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
at io.netty.handler.ssl.SslContext.toPrivateKey(SslContext.java:1014) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
at io.netty.handler.ssl.SslContextBuilder.keyManager(SslContextBuilder.java:265) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
... 20 more
Caused by: java.security.InvalidKeyException: IOException : algid parse error, not a sequence
at sun.security.pkcs.PKCS8Key.decode(PKCS8Key.java:352) ~[?:1.8.0_181]
at sun.security.pkcs.PKCS8Key.decode(PKCS8Key.java:357) ~[?:1.8.0_181]
at sun.security.ec.ECPrivateKeyImpl.<init>(ECPrivateKeyImpl.java:73) ~[sunec.jar:1.8.0_181]
........
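Based on the "private key was not in PKCS8 format" message, I suspect the key on the Logstash side needs to be converted. Would something like this be the right fix (assuming the current file is a traditional PEM private key; the output filename is just a placeholder, and ssl_key would then point at the new file)?

openssl pkcs8 -topk8 -inform PEM -outform PEM -nocrypt \
  -in /etc/logstash/keys/xxxx-key.pem \
  -out /etc/logstash/keys/xxxx-key-pkcs8.pem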
===========================
Here are the errors in filebeat.log on the Filebeat server (the one running Hadoop):
2018-10-16T15:46:14.647-0500 INFO log/harvester.go:251 Harvester started for file: /var/log/messages
2018-10-16T15:46:14.696-0500 INFO pipeline/output.go:95 Connecting to backoff(async(tcp://Logstashhostname:5044))
2018-10-16T15:46:15.721-0500 ERROR pipeline/output.go:100 Failed to connect to backoff(async(tcp://Logstashhostname:5044)): read tcp filebeatserverIP:58254->logstashserverIP:5044: read: connection reset by peer
2018-10-16T15:46:15.721-0500 INFO pipeline/output.go:93 Attempting to reconnect to backoff(async(tcp://Logstashhostname:5044)) with 1 reconnect attempt(s)
2018-10-16T15:46:17.732-0500 ERROR pipeline/output.go:100 Failed to connect to backoff(async(tcp://Logstashhostname:5044)): read tcp filebeatserverIP:58264->logstashserverIP:5044
....
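To rule out anything other than the TLS handshake itself, I was also planning to test the connection by hand from the Beat host, something like this (sketch only, hostname and paths as above):

openssl s_client -connect logstashhostname:5044 \
  -CAfile /opt/keys/ChainBundle2.crt \
  -cert /opt/keys/ServerCertificate.crt \
  -key /opt/keys/xxxx-key.pem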
======================================
Is anything missing or incorrect? Please help.
Thanks in advance
Li