Logstash to Logstash - invalid certificate?

I'm trying to implement Logstash-to-Logstash communication following the instructions at "Logstash-to-Logstash Communication | Logstash Reference [6.7] | Elastic", but I'm running into several issues.

I'm using a certificate that both the upstream and downstream instances have, along with its key, and I'm still getting the errors below:

logstash-shipper_1 | [ERROR] 2018-07-02 14:18:39.971 [[main]-pipeline-manager] lumberjack - All hosts unavailable, sleeping {:hosts=>["172.18.0.2"], :e=>#<OpenSSL::SSL::SSLError: certificate verify failed>, :backtrace=>["org/jruby/ext/openssl/SSLSocket.java:217:in `connect'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/jls-lumberjack-0.0.26/lib/lumberjack/client.rb:95:in `connection_start'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/jls-lumberjack-0.0.26/lib/lumberjack/client.rb:76:in `initialize'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/jls-lumberjack-0.0.26/lib/lumberjack/client.rb:34:in `connect'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/jls-lumberjack-0.0.26/lib/lumberjack/client.rb:24:in `initialize'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-lumberjack-3.1.7/lib/logstash/outputs/lumberjack.rb:86:in `connect'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-lumberjack-3.1.7/lib/logstash/outputs/lumberjack.rb:49:in `register'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:17:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:43:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:290:in `register_plugin'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:301:in `register_plugins'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:301:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:310:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:235:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:408:in `start_pipeline'"]}

logstash-indexer_1 | [INFO ] 2018-07-02 14:18:50.001 [nioEventLoopGroup-4-1] BeatsHandler - Exception: javax.net.ssl.SSLHandshakeException: error:10000416:SSL routines:OPENSSL_internal:SSLV3_ALERT_CERTIFICATE_UNKNOWN, from: /172.18.0.3:37938

These are two Docker containers talking to each other, and the network communication between them works correctly. The configurations are very simple:
Indexer:

input {
  beats {
    port => 10201
    ssl => true
    ssl_certificate => "/usr/share/logstash/certs/logstash.crt"
    ssl_key => "/usr/share/logstash/certs/logstash.key"
  }
}

and Shipper:

output {
  lumberjack {
    codec => json
    hosts => [ "logstash-indexer" ]
    port => 10201
    ssl_certificate => "/usr/share/logstash/certs/logstash.crt"
  }
}

Both are using the exact same certificate. What am I missing?
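
For completeness, the snippets above are the entire TLS-relevant parts of both pipelines. Wrapped in trivial placeholder plugins (the generator input and stdout output below are just stand-ins for illustration, not what I actually run), a minimal self-contained pair exercising this path would look roughly like this:

Shipper (upstream):

# Placeholder input just to have an event to send
input {
  generator {
    lines => ["test event"]
    count => 1
  }
}

# Forward events over TLS to the indexer, trusting the shared certificate
output {
  lumberjack {
    codec => json
    hosts => [ "logstash-indexer" ]
    port => 10201
    ssl_certificate => "/usr/share/logstash/certs/logstash.crt"
  }
}

Indexer (downstream):

# Terminate TLS with the same certificate and its key
input {
  beats {
    port => 10201
    ssl => true
    ssl_certificate => "/usr/share/logstash/certs/logstash.crt"
    ssl_key => "/usr/share/logstash/certs/logstash.key"
  }
}

# Placeholder output to confirm events arrive
output {
  stdout {
    codec => rubydebug
  }
}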

Here are more detailed logs from the Beats input on the indexer:

logstash-indexer_1 | [INFO ] 2018-07-02 14:52:59.259 [defaultEventExecutorGroup-5-2] BeatsHandler - [local: 0.0.0.0:10201, remote: 172.18.0.3:38010] Handling exception: javax.net.ssl.SSLHandshakeException: error:10000416:SSL routines:OPENSSL_internal:SSLV3_ALERT_CERTIFICATE_UNKNOWN
logstash-indexer_1 | [WARN ] 2018-07-02 14:52:59.259 [nioEventLoopGroup-3-3] DefaultChannelPipeline - An exceptionCaught() event was fired, and it reached at the tail of the pipeline. It usually means the last handler in the pipeline did not handle the exception.
logstash-indexer_1 | io.netty.handler.codec.DecoderException: javax.net.ssl.SSLHandshakeException: error:10000416:SSL routines:OPENSSL_internal:SSLV3_ALERT_CERTIFICATE_UNKNOWN
logstash-indexer_1 | at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:459) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:265) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:141) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459) [netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858) [netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) [netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at java.lang.Thread.run(Thread.java:748) [?:1.8.0_171]
logstash-indexer_1 | Caused by: javax.net.ssl.SSLHandshakeException: error:10000416:SSL routines:OPENSSL_internal:SSLV3_ALERT_CERTIFICATE_UNKNOWN
logstash-indexer_1 | at io.netty.handler.ssl.ReferenceCountedOpenSslEngine.shutdownWithError(ReferenceCountedOpenSslEngine.java:876) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.handler.ssl.ReferenceCountedOpenSslEngine.sslReadErrorResult(ReferenceCountedOpenSslEngine.java:1124) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.handler.ssl.ReferenceCountedOpenSslEngine.unwrap(ReferenceCountedOpenSslEngine.java:1080) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.handler.ssl.ReferenceCountedOpenSslEngine.unwrap(ReferenceCountedOpenSslEngine.java:1146) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.handler.ssl.ReferenceCountedOpenSslEngine.unwrap(ReferenceCountedOpenSslEngine.java:1189) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.handler.ssl.SslHandler$SslEngineType$1.unwrap(SslHandler.java:216) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.handler.ssl.SslHandler.unwrap(SslHandler.java:1248) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.handler.ssl.SslHandler.decodeNonJdkCompatible(SslHandler.java:1171) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.handler.ssl.SslHandler.decode(SslHandler.java:1196) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:489) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:428) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
logstash-indexer_1 | ... 16 more
