Can't get encrypted communication to Logstash working

I've created Elasticsearch users to secure communication within the stack (all on one machine).
Now I need to encrypt communication to Logstash from remote machines.

I'm retracing my steps here because I don't know where things go wrong. I've been at this for a few days and just can't get it to work.

Following the "Generate node certificates" doc, I created the CA and a certificate using these commands (performed on the machine that hosts Logstash, Elasticsearch, and Kibana):

bin/elasticsearch-certutil ca
bin/elasticsearch-certutil cert --ca elastic-stack-ca.p12

Then I extracted them to .crt and .key files with:

openssl pkcs12 -in elastic-stack-ca.p12 -out elkCA.crt -nodes
openssl pkcs12 -in elastic-certificates.p12 -out elk.key -nodes -nocerts
openssl pkcs12 -in elastic-certificates.p12 -out elk.crt -nodes
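
In hindsight, I'm not sure those extractions produce clean files: without -nokeys the .crt files probably also contain the private key and bag attributes, and as far as I understand the Logstash beats input wants the key in PKCS#8 format. If I were to redo it, I'd try something along these lines (same file names, flags are my guess at a cleaner extraction):

openssl pkcs12 -in elastic-stack-ca.p12 -out elkCA.crt -nokeys             # CA certificate only, no key
openssl pkcs12 -in elastic-certificates.p12 -out elk.crt -clcerts -nokeys  # node certificate only
openssl pkcs12 -in elastic-certificates.p12 -out elk.key -nocerts -nodes   # unencrypted private key only
openssl pkcs8 -topk8 -nocrypt -in elk.key -out elk-pkcs8.key               # convert the key to PKCS#8 for the beats input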

Then, in the Logstash config's input { } block, I have:

input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate_authorities => ["/usr/share/elasticsearch/elkCA.crt"]
    ssl_certificate => "/usr/share/elasticsearch/elk.crt"
    ssl_key => "/usr/share/elasticsearch/elk.key"
    ssl_verify_mode => "force_peer"
  }
}

This is what I get when I run Logstash:

    [2020-07-29T11:48:41,388][INFO ][logstash.inputs.beats    ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
    [2020-07-29T11:48:42,196][ERROR][logstash.javapipeline    ][main] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>java.security.cert.CertificateParsingException: signed fields invalid, :backtrace=>["sun.security.x509.X509CertImpl.parse(sun/security/x509/X509CertImpl.java:1842)", "sun.security.x509.X509CertImpl.<init>(sun/security/x509/X509CertImpl.java:195)", "sun.security.provider.X509Factory.parseX509orPKCS7Cert(sun/security/provider/X509Factory.java:471)", "sun.security.provider.X509Factory.engineGenerateCertificates(sun/security/provider/X509Factory.java:356)", "java.security.cert.CertificateFactory.generateCertificates(java/security/cert/CertificateFactory.java:462)", "org.logstash.netty.SslContextBuilder.loadCertificateCollection(org/logstash/netty/SslContextBuilder.java:144)", "org.logstash.netty.SslContextBuilder.buildContext(org/logstash/netty/SslContextBuilder.java:117)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:498)", "org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:426)", "org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:293)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java.lib.logstash.inputs.beats.create_server(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.9-java/lib/logstash/inputs/beats.rb:181)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java.lib.logstash.inputs.beats.RUBY$method$create_server$0$__VARARGS__(usr/share/logstash/vendor/bundle/jruby/$2_dot_5_dot_0/gems/logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java/lib/logstash/inputs//usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.9-java/lib/logstash/inputs/beats.rb)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java.lib.logstash.inputs.beats.register(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.9-java/lib/logstash/inputs/beats.rb:157)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java.lib.logstash.inputs.beats.RUBY$method$register$0$__VARARGS__(usr/share/logstash/vendor/bundle/jruby/$2_dot_5_dot_0/gems/logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java/lib/logstash/inputs//usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.9-java/lib/logstash/inputs/beats.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.register_plugins(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:216)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1809)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.register_plugins(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:215)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$register_plugins$0$__VARARGS__(usr/share/logstash/logstash_minus_core/lib/logstash//usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_inputs(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:326)", 
"usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$start_inputs$0$__VARARGS__(usr/share/logstash/logstash_minus_core/lib/logstash//usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:286)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$start_workers$0$__VARARGS__(usr/share/logstash/logstash_minus_core/lib/logstash//usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.run(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:170)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$run$0$__VARARGS__(usr/share/logstash/logstash_minus_core/lib/logstash//usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:125)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:318)", "java.lang.Thread.run(java/lang/Thread.java:748)"], "pipeline.sources"=>["/etc/logstash/conf.d/test.conf"], :thread=>"#<Thread:0x740ba01b run>"}
    [2020-07-29T11:48:42,225][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
    [2020-07-29T11:48:42,569][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
    [2020-07-29T11:48:47,368][INFO ][logstash.runner          ] Logstash shut down.
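
Since the exception seems to happen while Logstash loads the certificate files, I figured I could at least sanity-check that each PEM actually parses (paths as in the input above):

openssl x509 -in /usr/share/elasticsearch/elkCA.crt -noout -subject -dates   # should print the CA subject and validity
openssl x509 -in /usr/share/elasticsearch/elk.crt -noout -subject -dates     # should print the node cert subject and validity
openssl rsa -in /usr/share/elasticsearch/elk.key -check -noout               # should report "RSA key ok"

If x509 complains here, I assume the file contains more than a plain certificate, which would point back at the extraction step above.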


For troubleshooting, I commented out ssl_certificate_authorities and ssl_verify_mode in the Logstash input (i.e., the same beats input as above, just without those two settings). When I ran Logstash again it started, and it kept printing the same message over and over, which doesn't look like an error.

Then I copied elk.key and elk.crt to the remote machine running Filebeat and pointed to them in filebeat.yml:

output.logstash:
  # The Logstash hosts
  hosts: ["91.239.19.152:5044"] #not the real ip

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  ssl.certificate: "/etc/elk/certs/elk.crt"

  # Client Certificate Key
  ssl.key: "/etc/elk/certs/elk.key"
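
I'm also wondering whether I need to uncomment ssl.certificate_authorities and point it at the CA (copying elkCA.crt over as well), so Filebeat can verify the certificate Logstash presents. Something like this, with the CA path being my guess:

output.logstash:
  hosts: ["91.239.19.152:5044"] #not the real ip
  # CA used to verify the Logstash server certificate
  ssl.certificate_authorities: ["/etc/elk/certs/elkCA.crt"]
  # Client certificate and key (needed because of force_peer on the Logstash side)
  ssl.certificate: "/etc/elk/certs/elk.crt"
  ssl.key: "/etc/elk/certs/elk.key"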

When I then start Filebeat, I get this error:

2020-07-29T14:32:32.594Z        ERROR   [publisher_pipeline_output]     pipeline/output.go:155  Failed to connect to backoff(async(tcp://92.240.15.230:5044)): x509: cannot validate certificate for 92.240.15.230 because it doesn't contain any IP SANs
2020-07-29T14:32:32.594Z        INFO    [publisher_pipeline_output]     pipeline/output.go:146  Attempting to reconnect to backoff(async(tcp://92.240.15.230:5044)) with 13 reconnect attempt(s)
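
Reading that error, my guess is that the certificate simply has no IP SANs, so Filebeat can't validate it against the IP it connects to. Would regenerating the node certificate with the host's IP (and/or DNS name) as a SAN fix this? Something like the following, where the name, DNS entry, and IP are placeholders (and then extracting/converting again as above):

bin/elasticsearch-certutil cert --ca elastic-stack-ca.p12 \
  --name logstash --dns my-elk-host --ip 91.239.19.152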

Does anyone know what I did wrong? I'm frustrated and feel pretty lost with this. Huge thanks in advance.
