Secure filebeat to logstash

I was tasked with securing the whole Elastic flow, and I was able to secure the Logstash → Elasticsearch → Kibana part.

I'm now having some trouble encrypting filebeat to logstash (it's the last step). According to the "secure filebeat to logstash" page, I need a .crt and a .key configured on both ends. The problem is that, so far, I only have elastic-certificates.p12 and elastic-stack-ca.p12, as in this page.

Do I have to use these two files as if they're the .crt and .key? Do I need to create additional files?


EDIT:

I've created 3 files using the following commands:

openssl pkcs12 -in elastic-certificates.p12 -out elkCA.crt -nodes
openssl pkcs12 -in elastic-stack-ca.p12 -out elk.crt -nodes
openssl pkcs12 -in elastic-stack-ca.p12 -out elk.key -nodes -nocerts
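
(One thing worth knowing here: `openssl pkcs12 ... -nodes` dumps everything in the bundle, not just a certificate, which may be why Logstash later rejects the file. A throwaway sketch with made-up file names that shows this:)

```shell
# Build a throwaway cert + key, bundle them into a .p12, then dump with -nodes
# (all file names here are examples, not the real elastic-* files)
openssl req -x509 -newkey rsa:2048 -keyout demo.key -out demo.crt \
  -days 1 -nodes -subj "/CN=demo" 2>/dev/null
openssl pkcs12 -export -in demo.crt -inkey demo.key -out demo.p12 -passout pass:demo
openssl pkcs12 -in demo.p12 -passin pass:demo -nodes -out dumped.pem
grep -c "BEGIN" dumped.pem   # 2 -- the dump contains BOTH the cert and the private key
```

So a file produced this way is not a plain .crt: it carries bag attributes plus the cert and key together.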

Then in logstash's conf:

input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate_authorities => ["/usr/share/elasticsearch/elkCA.crt"]
    ssl_certificate => "/usr/share/elasticsearch/elk.crt"
    ssl_key => "/usr/share/elasticsearch/elk.key"
    ssl_verify_mode => "force_peer"
  }
}

But when I start logstash, I'm getting an error for the certificates (this looks terrible on mobile):

[2020-07-28T13:46:18,921][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-07-28T13:46:19,040][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-07-28T13:46:19,269][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/etc/logstash/conf.d/test.conf"], :thread=>"#<Thread:0x574db335 run>"}
[2020-07-28T13:46:21,015][INFO ][logstash.inputs.beats    ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2020-07-28T13:46:21,822][ERROR][logstash.javapipeline    ][main] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>java.security.cert.CertificateParsingException: signed fields invalid, :backtrace=>["sun.security.x509.X509CertImpl.parse(sun/security/x509/X509CertImpl.java:1842)", "sun.security.x509.X509CertImpl.<init>(sun/security/x509/X509CertImpl.java:195)", "sun.security.provider.X509Factory.parseX509orPKCS7Cert(sun/security/provider/X509Factory.java:471)", "sun.security.provider.X509Factory.engineGenerateCertificates(sun/security/provider/X509Factory.java:356)", "java.security.cert.CertificateFactory.generateCertificates(java/security/cert/CertificateFactory.java:462)", "org.logstash.netty.SslContextBuilder.loadCertificateCollection(org/logstash/netty/SslContextBuilder.java:144)", "org.logstash.netty.SslContextBuilder.buildContext(org/logstash/netty/SslContextBuilder.java:117)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:498)", "org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:426)", "org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:293)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java.lib.logstash.inputs.beats.create_server(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.9-java/lib/logstash/inputs/beats.rb:181)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java.lib.logstash.inputs.beats.RUBY$method$create_server$0$__VARARGS__(usr/share/logstash/vendor/bundle/jruby/$2_dot_5_dot_0/gems/logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java/lib/logstash/inputs//usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.9-java/lib/logstash/inputs/beats.rb)", 
"usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java.lib.logstash.inputs.beats.register(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.9-java/lib/logstash/inputs/beats.rb:157)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java.lib.logstash.inputs.beats.RUBY$method$register$0$__VARARGS__(usr/share/logstash/vendor/bundle/jruby/$2_dot_5_dot_0/gems/logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java/lib/logstash/inputs//usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.9-java/lib/logstash/inputs/beats.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.register_plugins(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:216)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1809)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.register_plugins(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:215)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$register_plugins$0$__VARARGS__(usr/share/logstash/logstash_minus_core/lib/logstash//usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_inputs(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:326)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$start_inputs$0$__VARARGS__(usr/share/logstash/logstash_minus_core/lib/logstash//usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:286)", 
"usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$start_workers$0$__VARARGS__(usr/share/logstash/logstash_minus_core/lib/logstash//usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.run(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:170)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$run$0$__VARARGS__(usr/share/logstash/logstash_minus_core/lib/logstash//usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:125)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:318)", "java.lang.Thread.run(java/lang/Thread.java:748)"], "pipeline.sources"=>["/etc/logstash/conf.d/test.conf"], :thread=>"#<Thread:0x574db335 run>"}
[2020-07-28T13:46:21,851][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2020-07-28T13:46:22,270][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2020-07-28T13:46:27,166][INFO ][logstash.runner          ] Logstash shut down.

Did I miss something? Have I extracted the wrong certificates? Should I use a different method?

Thanks ahead.

@elk6
Hi,

I've found I had to convert the key, as Logstash/Java does not like the OpenSSL defaults (AES-256 etc.).

Here's an example of what I used:

sudo openssl pkcs8 -in log.key -topk8 -v1 PBE-SHA1-3DES -out log.pkcs8.key
sudo chmod +r log.pkcs8.key
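
(A quick way to check the conversion worked is to look at the PEM header, since the converted key should be PKCS#8. A throwaway sketch with example file names; the demo adds `-passout` only so it runs non-interactively:)

```shell
# Generate a throwaway key, convert it the same way, then check the header
openssl genrsa -out example.key 2048 2>/dev/null
openssl pkcs8 -in example.key -topk8 -v1 PBE-SHA1-3DES -passout pass:demo -out example.pkcs8.key
head -1 example.pkcs8.key   # should read: -----BEGIN ENCRYPTED PRIVATE KEY-----
```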

Thanks for the response,

That is to extract a .key, right? May I ask how to extract the .crt?

Huge thanks ahead.

@elk6

Hi, hope this helps.

Replace the --pass (xyz) with whatever password you want.

This creates the CA cert and key used to generate the other certs:
bin/elasticsearch-certutil ca --pem --days 365 --out /out2/log-ca.zip --pass (pempass)

Unpack it and you get ca.crt and ca.key.

Generate the logstash cert:
bin/elasticsearch-certutil cert --pem --days 365 --ca-cert /opt/ca.crt --ca-key /opt/ca.key --ca-pass (pempass) --pass (logstash key pass) --silent --in /opt/instances.yml --out /opt/logcerts.zip

Generate the beats cert:
bin/elasticsearch-certutil cert --pem --days 365 --ca-cert /opt/ca.crt --ca-key /opt/ca.key --ca-pass (pempass) --pass (beats key pass) --name beats --out /opt/beatscerts.zip

Convert the logstash key, as Logstash does not work with the default encryption:
sudo openssl pkcs8 -in /opt/logs.key -topk8 -v1 PBE-SHA1-3DES -out /opt/logs.pkcs8.key

Give Logstash read permission:
sudo chmod +r logs.pkcs8.key
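
(Before wiring the generated certs into configs, it can save time to confirm they chain back to the CA with `openssl verify`. A throwaway sketch below; against the real files it would be the unzipped ca.crt and the logstash/beats .crt:)

```shell
# Throwaway CA plus a leaf cert signed by it, then verify the chain
openssl req -x509 -newkey rsa:2048 -keyout ca.key -out ca.crt \
  -days 1 -nodes -subj "/CN=demo-ca" 2>/dev/null
openssl req -newkey rsa:2048 -keyout leaf.key -out leaf.csr \
  -nodes -subj "/CN=demo-leaf" 2>/dev/null
openssl x509 -req -in leaf.csr -CA ca.crt -CAkey ca.key \
  -CAcreateserial -out leaf.crt -days 1 2>/dev/null
openssl verify -CAfile ca.crt leaf.crt   # prints "leaf.crt: OK"
```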


Thank you SO MUCH for this!

I'm following along and I wanted to ask about the instances.yml part.

The whole Elastic stack (Logstash, Elasticsearch and Kibana) is located on one machine. Do I still have to create an instances.yml file?

Hi,

The instances file was just the settings for the logstash cert. You could generate different certs for Kibana and Elasticsearch using the same commands and the CA crt/key created at the start.

Check the guide for elasticsearch-certutil for info on the instances.yml file.

Kind regards
Philip Robson

Thanks for the response, @probson. I seem to still have a similar issue.

[2020-07-30T08:47:18,103][INFO ][logstash.inputs.beats    ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2020-07-30T08:47:18,811][ERROR][logstash.javapipeline    ][main] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>java.lang.IllegalArgumentException: File does not contain valid private key: /usr/share/elasticsearch/elk/elkpkcs8.key, :backtrace=>["io.netty.handler.ssl.SslContextBuilder.keyManager(io/netty/handler/ssl/SslContextBuilder.java:270)", "io.netty.handler.ssl.SslContextBuilder.forServer(io/netty/handler/ssl/SslContextBuilder.java:90)", "org.logstash.netty.SslContextBuilder.buildContext(org/logstash/netty/SslContextBuilder.java:104)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:498)", "org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:426)", "org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:293)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java.lib.logstash.inputs.beats.create_server(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.9-java/lib/logstash/inputs/beats.rb:181)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java.lib.logstash.inputs.beats.RUBY$method$create_server$0$__VARARGS__(usr/share/logstash/vendor/bundle/jruby/$2_dot_5_dot_0/gems/logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java/lib/logstash/inputs//usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.9-java/lib/logstash/inputs/beats.rb)", "usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java.lib.logstash.inputs.beats.register(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.9-java/lib/logstash/inputs/beats.rb:157)", 
"usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java.lib.logstash.inputs.beats.RUBY$method$register$0$__VARARGS__(usr/share/logstash/vendor/bundle/jruby/$2_dot_5_dot_0/gems/logstash_minus_input_minus_beats_minus_6_dot_0_dot_9_minus_java/lib/logstash/inputs//usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-beats-6.0.9-java/lib/logstash/inputs/beats.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.register_plugins(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:216)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1809)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.register_plugins(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:215)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$register_plugins$0$__VARARGS__(usr/share/logstash/logstash_minus_core/lib/logstash//usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_inputs(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:326)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$start_inputs$0$__VARARGS__(usr/share/logstash/logstash_minus_core/lib/logstash//usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:286)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$start_workers$0$__VARARGS__(usr/share/logstash/logstash_minus_core/lib/logstash//usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.run(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:170)", 
"usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$run$0$__VARARGS__(usr/share/logstash/logstash_minus_core/lib/logstash//usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:125)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:318)", "java.lang.Thread.run(java/lang/Thread.java:748)"], "pipeline.sources"=>["/etc/logstash/conf.d/test.conf"], :thread=>"#<Thread:0x63dabdf0 run>"}
[2020-07-30T08:47:18,835][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
[2020-07-30T08:47:19,163][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2020-07-30T08:47:24,134][INFO ][logstash.runner          ] Logstash shut down.


Just to redo my steps:

I've created an instances.yml file containing:

instances:
  - name: "elk"
    ip:
      - "1.2.3.4" # IP of the server that hosts the Elastic stack
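
(Since the cert gets its identity from that IP entry, it's worth confirming the IP actually landed in the cert's SAN. A throwaway self-signed sketch with the same made-up IP; against the real elk.crt you'd just run the `openssl x509` line. Note `-addext`/`-ext` need OpenSSL 1.1.1+:)

```shell
# Throwaway self-signed cert with an IP SAN, then print the SAN back out
openssl req -x509 -newkey rsa:2048 -keyout san.key -out san.crt -days 1 -nodes \
  -subj "/CN=elk" -addext "subjectAltName=IP:1.2.3.4" 2>/dev/null
openssl x509 -in san.crt -noout -ext subjectAltName   # should show IP Address:1.2.3.4
```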

Made the keys, converted the elk key to PKCS#8, and made sure it has read permission:

# ll /usr/share/elasticsearch/elk/
total 12
-rw-r--r-- 1 root root 1164 Jul 30 08:35 elk.crt
-rw-r--r-- 1 root root 1766 Jul 30 08:35 elk.key
-rw-r--r-- 1 root root 1785 Jul 30 08:45 elkpkcs8.key

# ll /usr/share/elasticsearch/ca
total 8
-rw-r--r-- 1 root root 1200 Jul 29 18:49 ca.crt
-rw-r--r-- 1 root root 1766 Jul 29 18:49 ca.key

Then in the logstash conf:

input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate_authorities => ["/usr/share/elasticsearch/ca/ca.crt"]
    ssl_certificate => "/usr/share/elasticsearch/elk/elk.crt"
    ssl_key => "/usr/share/elasticsearch/elk/elkpkcs8.key"
    ssl_verify_mode => "force_peer"
  }
}

@elk6

Good morning,

Did you set a password for the keys? If so, you will need to add the line below, with the password used for the elk key:
ssl_key_passphrase => ""

The same will have to be done on the Beats agent's yml, with the CA crt, the beats crt/key, and the password/passphrase.

kind regards
Phil


Thanks for the response, it definitely did something!

It now loops through this:

Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2020-07-30T09:41:23,671][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command lin
[2020-07-30T09:41:23,814][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.8.0", "jruby.version"=>"jruby 
inux-x86_64]"}
[2020-07-30T09:41:27,560][INFO ][org.reflections.Reflections] Reflections took 52 ms to scan 1 urls, producing 21 keys and 41 values
[2020-07-30T09:41:28,842][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :ad
[2020-07-30T09:41:29,178][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://logstash_in
[2020-07-30T09:41:29,257][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-07-30T09:41:29,267][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won'
[2020-07-30T09:41:29,377][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::Elastic
[2020-07-30T09:41:29,465][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-07-30T09:41:29,627][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_pat
1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type
>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>
"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-07-30T09:41:29,899][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "p
["/etc/logstash/conf.d/test.conf"], :thread=>"#<Thread:0xdb7be29 run>"}
[2020-07-30T09:41:31,678][INFO ][logstash.inputs.beats    ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2020-07-30T09:41:32,121][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-07-30T09:41:32,210][INFO ][org.logstash.beats.Server][main][7b03bb9df2bf5ca8143b76e8a09d667f1c67ee92e0dcc1e6ad3c0e815eea459f] 
[2020-07-30T09:41:32,217][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_
[2020-07-30T09:41:32,630][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2020-07-30T09:41:35,538][INFO ][org.logstash.beats.BeatsHandler][main][7b03bb9df2bf5ca8143b76e8a09d667f1c67ee92e0dcc1e6ad3c0e815eea
rror:100000f7:SSL routines:OPENSSL_internal:WRONG_VERSION_NUMBER
[2020-07-30T09:41:35,550][WARN ][io.netty.channel.DefaultChannelPipeline][main][7b03bb9df2bf5ca8143b76e8a09d667f1c67ee92e0dcc1e6ad3c
y means the last handler in the pipeline did not handle the exception.
io.netty.handler.codec.DecoderException: javax.net.ssl.SSLHandshakeException: error:100000f7:SSL routines:OPENSSL_internal:WRONG_VER
        at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:472) ~[netty-all-4.1.30.Final.jar:4.1.30
        at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:278) ~[netty-all-4.1.30.Final.jar:4.1.3
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) ~[netty-all-4.1.
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) ~[netty-all-4.1.
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) ~[netty-all-4.1.30
        at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1434) ~[netty-all-4.1.30.Fina
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) ~[netty-all-4.1.
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) ~[netty-all-4.1.
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:965) ~[netty-all-4.1.30.Final.jar:4.1
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163) ~[netty-all-4.1.30.Final.
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:644) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:579) ~[netty-all-4.1.30.Final.jar:4.1.30
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:496) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
        at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:897) [netty-all-4.1.30.Final.jar:
        at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) [netty-all-4.1.30.Final.jar:4.1.30.
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_252]
Caused by: javax.net.ssl.SSLHandshakeException: error:100000f7:SSL routines:OPENSSL_internal:WRONG_VERSION_NUMBER
        at io.netty.handler.ssl.ReferenceCountedOpenSslEngine.sslReadErrorResult(ReferenceCountedOpenSslEngine.java:1140) ~[netty-al
        at io.netty.handler.ssl.ReferenceCountedOpenSslEngine.unwrap(ReferenceCountedOpenSslEngine.java:1101) ~[netty-all-4.1.30.Fin
        at io.netty.handler.ssl.ReferenceCountedOpenSslEngine.unwrap(ReferenceCountedOpenSslEngine.java:1169) ~[netty-all-4.1.30.Fin
        at io.netty.handler.ssl.ReferenceCountedOpenSslEngine.unwrap(ReferenceCountedOpenSslEngine.java:1212) ~[netty-all-4.1.30.Fin
        at io.netty.handler.ssl.SslHandler$SslEngineType$1.unwrap(SslHandler.java:216) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
        at io.netty.handler.ssl.SslHandler.unwrap(SslHandler.java:1297) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
        at io.netty.handler.ssl.SslHandler.decodeNonJdkCompatible(SslHandler.java:1211) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
        at io.netty.handler.ssl.SslHandler.decode(SslHandler.java:1245) ~[netty-all-4.1.30.Final.jar:4.1.30.Final]
        at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:441) ~[netty-all-4.1.30.Final.jar:4.1.30
        ... 16 more

Is this possibly caused by all the other servers trying to send logs without being configured for SSL? Or is it an error in the logstash configuration?
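
(From what I've read, WRONG_VERSION_NUMBER usually means non-TLS bytes hit a TLS listener, so a direct handshake test against the port should separate a broken listener from unconfigured clients. A throwaway local sketch below, with a made-up port and cert; the real check would be `openssl s_client -connect <logstash-host>:5044 -CAfile ca.crt`:)

```shell
# Stand up a throwaway local TLS listener, then handshake against it with s_client
openssl req -x509 -newkey rsa:2048 -keyout srv.key -out srv.crt \
  -days 1 -nodes -subj "/CN=localhost" 2>/dev/null
openssl s_server -accept 15044 -cert srv.crt -key srv.key -quiet &
SRV=$!; sleep 1
# A clean handshake prints "Verify return code: 0 (ok)"; a plaintext listener would error out
echo Q | openssl s_client -connect 127.0.0.1:15044 -CAfile srv.crt 2>/dev/null \
  | grep "Verify return code" | tee handshake.txt
kill $SRV
```

If the handshake against 5044 succeeds, the loop of errors is most likely the other, not-yet-configured beats agents hitting the now-TLS-only port.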

Also, you mentioned that the beats side would also need the SSL password. Do you happen to know the line to include it? And in filebeat's "hosts", can I refer to the elk server by IP?

Huge thanks ahead!

Hi,

Below is the config for Beats. You will need to go through all your Beats agents, update the yml file, and include the relevant files.
ssl.enabled: true
ssl.certificate_authorities: ["ca.crt"]
ssl.certificate: ".crt"
ssl.key: ".key"
ssl.key_passphrase: ''
ssl.verification_mode: full


@elk6

Sorry, didn't see your host/IP question. I don't see why not, since you used the IP in the instances.yml.
If you do use DNS for communication to Logstash and between your Elastic services, then you will need to modify the instances.yml to include the domain names and recreate/re-issue the certs.

Thank you so much for the response. I did just that and got a new type of error.

2020-07-30T12:54:10.082Z        ERROR   [publisher_pipeline_output]     pipeline/output.go:155  Failed to connect to backoff(async(tcp://91.242.11.220:5044)): x509: certificate signed by unknown authority (possibly because of "crypto/rsa: verification error" while trying to verify candidate authority certificate "Elastic Certificate Tool Autogenerated CA")

filebeat.yml:

output.logstash:
  # The Logstash hosts
  hosts: ["91.242.11.220:5044"] # Not the real IP
  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  ssl.enabled: true
  ssl.certificate_authorities: ["/etc/elk/certs/ca.crt"]
  # Certificate for SSL client authentication
  ssl.certificate: "/etc/elk/beatcert/beats.crt"

  # Client Certificate Key
  ssl.key: "/etc/elk/beatcert/beats.key"
  ssl.key_passphrase: "password" #Not the real password
  ssl.verification_mode: full

@elk6
The CA cert used would have to be the same as

ssl_certificate_authorities => ["/usr/share/elasticsearch/ca/ca.crt"]

as this is the certificate that signed/generated the other certificates.

Yeah, my bad, I meant to say "made sure it's the same as the ca.crt that is in logstash's .conf".

The ca.crt in ssl_certificate_authorities => ["/usr/share/elasticsearch/ca/ca.crt"] is the same ca.crt as in ssl.certificate_authorities: ["/etc/elk/certs/ca.crt"].

I scp'd it.
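
(One quick way to prove the scp'd copy really is the same CA is to compare SHA-256 fingerprints on both machines. A throwaway sketch with made-up file names; the real check would run the fingerprint line against ca.crt on the logstash box and on each filebeat host:)

```shell
# Throwaway CA and a copy of it: identical fingerprints mean identical certs
openssl req -x509 -newkey rsa:2048 -keyout myca.key -out myca.crt \
  -days 1 -nodes -subj "/CN=demo-ca" 2>/dev/null
cp myca.crt myca_copy.crt
openssl x509 -in myca.crt -noout -fingerprint -sha256
openssl x509 -in myca_copy.crt -noout -fingerprint -sha256   # same output => same cert
```

Any mismatch between hosts would explain the "certificate signed by unknown authority" error on its own.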

It does suggest that it has an issue with ssl.certificate_authorities.
I've checked my config on my Filebeat Linux node and cannot see anything (I use Windows for most Beats), and I didn't seem to make any notes regarding the Linux node. I remember having fun when using a GoDaddy cert and trying to get that working.

The only difference I can see in my notes is that I used DNS within the instances.yml file.

Could you comment out the following line?

ssl.certificate_authorities: ["/etc/elk/certs/ca.crt"]

If I comment out that line, it throws a different error:

 ERROR   [publisher_pipeline_output]     pipeline/output.go:155  Failed to connect to backoff(async(tcp://92.240.15.230:5044)): x509: certificate signed by unknown authority

@elk6
It's acknowledging the file at least. I'm afraid I'm not sure where to go with this one, as I cannot remember if I had to do anything else. I do remember receiving an error suggesting it didn't work, but then it did.

Do you have access to a Windows machine to try Beats on? I definitely know that I did not need to do anything else other than point the Beats yml at the files.


I do, but I don't think it would be much different; all the machines I'll have to take logs from are Linux machines. You've been a huge help so far, thank you. Hopefully I'll solve this one missing piece.

@elk6
No problem. Just one more check: the ca.crt is the same one used to create both the beats and the elk crt, and only the elk key used for logstash was converted to PBE-SHA1-3DES?


Yeah, as far as I can recall, but I'm so lost at this point that I'm not sure anymore. I'll try it all again.