Logstash security best practices: keystore, monitoring, ssl_verification_mode

I've done some research, and I hope this helps people who might have the same questions I did :sweat_smile:

If you have any improvements or remarks, please suggest them!

For your Logstash node to work properly after installation, and to be able to monitor it, you have to write a pipeline .conf file in the conf.d directory (/etc/logstash/conf.d) and give it the appropriate permissions and user/group ownership. (I wrote a pipeline that takes the logs from my pfSense firewall into Logstash and outputs them to Elasticsearch; a minimal sketch is below.)
This wasn't in the documentation; I would be grateful if you could add it.
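
For reference, here is roughly what I mean: a minimal pipeline skeleton plus the ownership commands. The filename, syslog port, and single host below are placeholders, not my exact setup (my full output block is further down):

input {
  syslog {
    port => 5144
  }
}
output {
  elasticsearch {
    hosts => ["https://192.168.1.14:9200"]
  }
}

chown root:logstash /etc/logstash/conf.d/pfsense.conf
chmod 640 /etc/logstash/conf.d/pfsense.conf
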
Optionally, set the JVM heap size in jvm.options to half of your server's RAM. My server has 8 GB, so I used -Xms4g and -Xmx4g.
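
In jvm.options that is just the two heap flags (4g here because my host has 8 GB):

-Xms4g
-Xmx4g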

In my opinion, the best way to monitor Logstash is with Metricbeat, as I felt the Elastic Agent + Logstash integration wasn't quite mature yet.
Here is how to set it up: Collect Logstash monitoring data with Metricbeat | Logstash Reference [8.15] | Elastic
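
In short, on the Metricbeat host that comes down to enabling the logstash-xpack module and pointing it at the Logstash API (localhost:9600 is the default API address; adjust to your setup and double-check the details against the linked page):

metricbeat modules enable logstash-xpack

# modules.d/logstash-xpack.yml
- module: logstash
  metricsets:
    - node
    - node_stats
  period: 10s
  hosts: ["localhost:9600"]
  xpack.enabled: true

Metricbeat then ships that data to the monitoring cluster through its own Elasticsearch output, which is where remote_monitoring_user (see below) comes in.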

For the cluster_uuid, set it in logstash.yml:

monitoring.cluster_uuid: PRODUCTION_ES_CLUSTER_UUID

If your pipeline is working properly, Logstash should be able to identify the cluster UUID on its own, but sometimes you have to specify it explicitly, for example when you use a dedicated monitoring cluster, especially with a non-Basic license.
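
If you're not sure of the value, the production cluster's UUID is in the response of a plain GET / against Elasticsearch (Kibana Dev Tools or curl):

GET /
# the response contains, among other fields:
# "cluster_uuid" : "PRODUCTION_ES_CLUSTER_UUID"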

You should then be able to see it under Stack Monitoring in Kibana.

For logstash.yml:

The verification modes that worked for me are full and none; as for certificate, I guess it's not supported yet. Take a look at this: Elasticsearch output plugin | Logstash Reference [8.15] | Elastic

So, to avoid using none, I had to generate an SSL certificate for my Logstash node with its hostname and IP address. For that I used the following command:

./bin/elasticsearch-certutil cert \
  --name logstash \
  --ca-cert /path/to/ca/cert.crt \
  --ca-key /path/to/ca/private.key \
  --dns your.host.name.here \
  --ip 192.168.1.10 \
  --pem
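
Note that with --pem, certutil writes a zip archive rather than loose files, so you still need to extract it into your certs directory; something like this (the archive name may differ depending on your version or the --out option):

unzip certificate-bundle.zip -d /etc/logstash/certs/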

If your CA certificate is only in PEM form (elasticsearch-ca.pem), you can convert your CA archive (elastic-stack-ca.p12) to get the cert.crt and private.key needed above.

Here are the commands I used:

openssl pkcs12 -in elastic-stack-ca.p12 -out cert.crt -clcerts -nokeys

openssl pkcs12 -in elastic-stack-ca.p12 -out private.key -nocerts -nodes

So my logstash.yml file looks like this:

xpack.monitoring.elasticsearch.username: logstash_system
xpack.monitoring.elasticsearch.password: XLo75i-zJblqwd50X9ZF
xpack.monitoring.elasticsearch.hosts: ["https://192.168.1.14:9200", "https://192.168.1.15:9200", "https://192.168.1.16:9200"]
xpack.monitoring.elasticsearch.ssl.certificate_authority: /etc/logstash/certs/elasticsearch-ca.pem
xpack.monitoring.elasticsearch.ssl.verification_mode: full
xpack.monitoring.elasticsearch.ssl.certificate: /etc/logstash/certs/logstash.crt
xpack.monitoring.elasticsearch.ssl.key: /etc/logstash/certs/logstash.key

For the username and password, it's better to use the dedicated logstash_system user in logstash.yml, so I reset its default password using:

/usr/share/elasticsearch/bin/elasticsearch-reset-password -u logstash_system -i

The -i flag lets you type the password yourself; drop it to get a randomly generated one instead.
You can of course also do this from the Kibana UI or from Dev Tools.

Same thing for Metricbeat: I used the built-in remote_monitoring_user and reset its password with the same command:

/usr/share/elasticsearch/bin/elasticsearch-reset-password -u remote_monitoring_user -i

Finally, for the pipeline itself, instead of using the elastic superuser I created a role called logstash_writer_role and then a user called logstash_writer with that role. More on how to create them here: Secure your connection to Elasticsearch | Logstash Reference [8.15] | Elastic (a rough Dev Tools version is sketched below).
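
For completeness, in Dev Tools that looks roughly like this, with the privilege lists taken from that doc page, so double-check them against your version (the user's password is obviously a placeholder):

POST _security/role/logstash_writer_role
{
  "cluster": ["manage_index_templates", "monitor", "manage_ilm"],
  "indices": [
    {
      "names": [ "logstash-*" ],
      "privileges": ["write", "create", "create_index", "manage", "manage_ilm"]
    }
  ]
}

POST _security/user/logstash_writer
{
  "password": "a_strong_password_here",
  "roles": [ "logstash_writer_role" ],
  "full_name": "Internal Logstash User"
}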

The only thing to pay attention to is this part:

"indices": [
    {
      "names": [ "logstash-*" ],

Your pipeline's index has to match that pattern, i.e. start with the same logstash- prefix.
In my case:

output {
  elasticsearch {
    hosts => ["https://192.168.1.14:9200", "https://192.168.1.15:9200", "https://192.168.1.16:9200"]
    index => "logstash-pfsense-syslog-%{+YYYY.MM.dd}"
    user => "logstash_writer"
    password => "password"
    ssl => true
    cacert => "/etc/logstash/certs/elasticsearch-ca.pem"
  }
  stdout { codec => rubydebug }
}

A better way is to use API keys instead of a username and password, since you can limit the permissions granted to the API key, as mentioned here: Secure your connection to Elasticsearch | Logstash Reference [8.15] | Elastic
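
For example, once you have created a key (from Kibana's Stack Management or via the _security/api_key endpoint), the elasticsearch output just takes the key's id and secret joined by a colon instead of user/password; note that this option requires SSL to be enabled (the value below is a placeholder):

output {
  elasticsearch {
    ...
    api_key => "API_KEY_ID:API_KEY_SECRET"
    ssl => true
    ...
  }
}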

Another option is to store the username and password in the Logstash keystore and reference them as variables in pipeline.conf or logstash.yml, but I wasn't able to get that to work properly:
Secrets keystore for secure settings | Logstash Reference [8.15] | Elastic
Here are the commands I used to create my keystore:

set +o history
export LOGSTASH_KEYSTORE_PASS=mypassword
set -o history
sudo -E /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash create

And then to add my variables:

sudo -E /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash add ES_USER
sudo -E /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash add ES_PWD

And then list them:

sudo -E /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash list

Give the logstash.keystore file the appropriate permissions and user/group ownership, and then reference the secrets in pipeline.conf (and logstash.yml) like this:

output {
  elasticsearch {
    ...
    user => "${ES_USER}"
    password => "${ES_PWD}"
    ...
  }
}
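
The same ${VAR} syntax also works in logstash.yml, so the monitoring credentials could come from the keystore too, for example with separate keys added the same way (MONITORING_USER and MONITORING_PWD are just names I made up):

xpack.monitoring.elasticsearch.username: "${MONITORING_USER}"
xpack.monitoring.elasticsearch.password: "${MONITORING_PWD}"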

Here is the output of the keystore creation:

root@logstash:/etc/logstash# sudo -E /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash create
Using bundled JDK: /usr/share/logstash/jdk
2024-08-22 01:29:55,998 main ERROR Unable to locate appender "${sys:ls.log.format}_console" for logger config "root"
2024-08-22 01:29:55,999 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling" for logger config "root"
2024-08-22 01:29:55,999 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling_slowlog" for logger config "slowlog"
2024-08-22 01:29:56,000 main ERROR Unable to locate appender "${sys:ls.log.format}_console_slowlog" for logger config "slowlog"
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties

[2024-08-22T01:29:56,574][INFO ][org.logstash.secret.store.backend.JavaKeyStore] Created Logstash keystore at /etc/logstash/logstash.keystore
Created Logstash keystore at /etc/logstash/logstash.keystore
root@logstash:/etc/logstash# sudo -E /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash add ES_USER
Using bundled JDK: /usr/share/logstash/jdk
2024-08-22 01:30:14,473 main ERROR Unable to locate appender "${sys:ls.log.format}_console" for logger config "root"
2024-08-22 01:30:14,474 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling" for logger config "root"
2024-08-22 01:30:14,475 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling_slowlog" for logger config "slowlog"
2024-08-22 01:30:14,475 main ERROR Unable to locate appender "${sys:ls.log.format}_console_slowlog" for logger config "slowlog"
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties

Enter value for ES_USER:
Added 'es_user' to the Logstash keystore.
root@logstash:/etc/logstash# sudo -E /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash add ES_PWD
Using bundled JDK: /usr/share/logstash/jdk
2024-08-22 01:31:01,325 main ERROR Unable to locate appender "${sys:ls.log.format}_console" for logger config "root"
2024-08-22 01:31:01,326 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling" for logger config "root"
2024-08-22 01:31:01,327 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling_slowlog" for logger config "slowlog"
2024-08-22 01:31:01,327 main ERROR Unable to locate appender "${sys:ls.log.format}_console_slowlog" for logger config "slowlog"
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties

Enter value for ES_PWD:
Added 'es_pwd' to the Logstash keystore.
root@logstash:/etc/logstash# sudo -E /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash list
Using bundled JDK: /usr/share/logstash/jdk
2024-08-22 01:31:51,318 main ERROR Unable to locate appender "${sys:ls.log.format}_console" for logger config "root"
2024-08-22 01:31:51,319 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling" for logger config "root"
2024-08-22 01:31:51,320 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling_slowlog" for logger config "slowlog"
2024-08-22 01:31:51,320 main ERROR Unable to locate appender "${sys:ls.log.format}_console_slowlog" for logger config "slowlog"
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties

es_pwd
es_user
root@logstash:/etc/logstash# ls
'${sys:ls.logs}'   certs   conf.d   jvm.options   log4j2.properties   logstash.keystore   logstash-sample.conf   logstash.yml   pipelines.yml   startup.options
root@logstash:/etc/logstash# chown root:logstash logstash.keystore
root@logstash:/etc/logstash# chmod 770 logstash.keystore

And here is the error I got in the Logstash journal:

root@logstash:/etc/logstash/conf.d# systemctl restart logstash
root@logstash:/etc/logstash/conf.d# journalctl -u logstash -f
-- Logs begin at Sun 2024-08-18 20:34:53 UTC. --
Aug 22 01:38:00 logstash logstash[886]: }
Aug 22 01:38:05 logstash systemd[1]: Stopping logstash...
Aug 22 01:38:05 logstash logstash[886]: [2024-08-22T01:38:05,217][WARN ][logstash.runner          ] SIGTERM received. Shutting down.
Aug 22 01:38:06 logstash logstash[886]: [2024-08-22T01:38:06,606][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
Aug 22 01:38:07 logstash logstash[886]: [2024-08-22T01:38:07,318][INFO ][logstash.pipelinesregistry] Removed pipeline from registry successfully {:pipeline_id=>:main}
Aug 22 01:38:07 logstash logstash[886]: [2024-08-22T01:38:07,352][INFO ][logstash.runner          ] Logstash shut down.
Aug 22 01:38:07 logstash systemd[1]: logstash.service: Succeeded.
Aug 22 01:38:07 logstash systemd[1]: Stopped logstash.
Aug 22 01:38:07 logstash systemd[1]: Started logstash.
Aug 22 01:38:07 logstash logstash[41720]: Using bundled JDK: /usr/share/logstash/jdk
Aug 22 01:38:25 logstash logstash[41720]: Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
Aug 22 01:38:26 logstash logstash[41720]: [2024-08-22T01:38:26,120][INFO ][logstash.runner          ] Log4j configuration path used is: /etc/logstash/log4j2.properties
Aug 22 01:38:26 logstash logstash[41720]: [2024-08-22T01:38:26,123][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.15.0", "jruby.version"=>"jruby 9.4.8.0 (3.1.4) 2024-07-02 4d41e55a67 OpenJDK 64-Bit Server VM 21.0.4+7-LTS on 21.0.4+7-LTS +indy +jit [x86_64-linux]"}
Aug 22 01:38:26 logstash logstash[41720]: [2024-08-22T01:38:26,126][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms4g, -Xmx4g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
Aug 22 01:38:26 logstash logstash[41720]: [2024-08-22T01:38:26,130][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
Aug 22 01:38:26 logstash logstash[41720]: [2024-08-22T01:38:26,130][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
Aug 22 01:38:27 logstash logstash[41720]: [2024-08-22T01:38:27,040][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::OrgLogstashSecretStore::SecretStoreException::LoadException", :message=>"Found a file at /etc/logstash/logstash.keystore, but it is not a valid Logstash keystore.", :backtrace=>["org.logstash.secret.store.backend.JavaKeyStore.load(JavaKeyStore.java:294)", "org.logstash.secret.store.backend.JavaKeyStore.load(JavaKeyStore.java:77)", "org.logstash.secret.store.SecretStoreFactory.doIt(SecretStoreFactory.java:129)", "org.logstash.secret.store.SecretStoreFactory.load(SecretStoreFactory.java:115)", "org.logstash.secret.store.SecretStoreExt.getIfExists(SecretStoreExt.java:60)", "org.logstash.execution.AbstractPipelineExt.getSecretStore(AbstractPipelineExt.java:797)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:238)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:173)", "org.logstash.execution.AbstractPipelineExt$INVOKER$i$initialize.call(AbstractPipelineExt$INVOKER$i$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:847)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1379)", "org.jruby.ir.instructions.InstanceSuperInstr.interpret(InstanceSuperInstr.java:139)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:363)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:128)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:115)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:446)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:92)", "org.jruby.RubyClass.newInstance(RubyClass.java:949)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:446)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:92)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:548)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:363)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:88)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:238)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:225)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:228)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:476)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:293)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:324)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:118)", "org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:66)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.Block.call(Block.java:144)", 
"org.jruby.RubyProc.call(RubyProc.java:354)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:111)", "java.base/java.lang.Thread.run(Thread.java:1583)"]}
Aug 22 01:38:27 logstash logstash[41720]: [2024-08-22T01:38:27,076][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
Aug 22 01:38:27 logstash logstash[41720]: [2024-08-22T01:38:27,085][INFO ][logstash.runner          ] Logstash shut down.
Aug 22 01:38:27 logstash logstash[41720]: [2024-08-22T01:38:27,092][FATAL][org.logstash.Logstash    ] Logstash stopped processing because of an error: (SystemExit) exit
Aug 22 01:38:27 logstash logstash[41720]: org.jruby.exceptions.SystemExit: (SystemExit) exit
Aug 22 01:38:27 logstash logstash[41720]:         at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:921) ~[jruby.jar:?]
Aug 22 01:38:27 logstash logstash[41720]:         at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:880) ~[jruby.jar:?]
Aug 22 01:38:27 logstash logstash[41720]:         at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:90) ~[?:?]
Aug 22 01:38:27 logstash systemd[1]: logstash.service: Main process exited, code=exited, status=1/FAILURE
Aug 22 01:38:27 logstash systemd[1]: logstash.service: Failed with result 'exit-code'.
Aug 22 01:38:27 logstash systemd[1]: logstash.service: Scheduled restart job, restart counter is at 1.
Aug 22 01:38:27 logstash systemd[1]: Stopped logstash.
Aug 22 01:38:27 logstash systemd[1]: Started logstash.
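
One thing I suspect here (not confirmed on my side yet, so treat it as an assumption): since the keystore was created with LOGSTASH_KEYSTORE_PASS exported, that same variable probably has to be present in the environment of the Logstash service itself, otherwise it can't decrypt the keystore and reports it as "not a valid Logstash keystore". Under systemd, that could be passed with a drop-in override, something like:

# /etc/systemd/system/logstash.service.d/keystore.conf  (example path)
[Service]
Environment="LOGSTASH_KEYSTORE_PASS=mypassword"

systemctl daemon-reload
systemctl restart logstash

(And of course lock down that file's permissions, since it contains the keystore password.)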

Thank you for stopping by!
If you have any suggestions, feel free to share them!
