Hi community,
I configured SSL on the whole stack (Elasticsearch, Kibana, Logstash, Filebeat).
When I manually launch Logstash to test my configuration, I get the output below.
I think it's good, but I don't see anything in the Discover part of Kibana...
Could you help me?
Using bundled JDK: /usr/share/logstash/jdk
Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
[2021-07-10T19:02:26,373][INFO ][logstash.runner ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2021-07-10T19:02:26,551][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.13.2", "jruby.version"=>"jruby 9.2.16.0 (2.5.7) 2021-03-03 f82228dc32 OpenJDK 64-Bit Server VM 11.0.11+9 on 11.0.11+9 +indy +jit [linux-x86_64]"}
[2021-07-10T19:02:30,462][INFO ][org.reflections.Reflections] Reflections took 243 ms to scan 1 urls, producing 24 keys and 48 values
Configuration OK
[2021-07-10T19:02:31,615][INFO ][logstash.runner ] Using config.test_and_exit mode. Config Validation Result: OK. Exiting Logstash
How are you starting logstash? Are you running it as a service?
Yes, I start Logstash as a service.
And what is your configuration that is not working?
All my services run; it's just that logstash/logstash-plain.log doesn't show the same output as when I run Logstash manually. The logs don't tell me that the configuration is OK.
And when I try to check the SSL with curl, I get this response.
If you are following the tutorial and running Logstash as a service, then when you run it on the command line (without -t) I would expect an error telling you port 5044 is already in use. If you are not getting that, then the service is probably not running.
What does your logstash configuration file look like?
@Badger Thanks
I have configured Filebeat on the same machine as Kibana and Logstash.
The Filebeat service is running; it says "Started Filebeat sends log files to Logstash or directly to Elasticsearch.".
These are the uncommented lines in my filebeat.yml:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /etc/filebeat/logstash-tutorial-dataset

#============================= Filebeat modules ===============================
filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

#==================== Elasticsearch template setting ==========================
setup.template.settings:
  index.number_of_shards: 3

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["node1-ad-it.fr:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  ssl.certificate_authorities: ["/etc/filebeat/config/certs/ca.crt"]

  # Certificate for SSL client authentication
  ssl.certificate: "/etc/logstash/config/certs/logstash.crt"

  # Client Certificate Key
  ssl.key: "/etc/logstash/config/certs/logstash.pkcs8.key"

#================================ Processors =====================================
# Configure processors to enhance or manipulate events generated by the beat.
processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
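A side note on the ssl.key above: Logstash's beats input expects its private key in PKCS#8 format, which is presumably why the file is named logstash.pkcs8.key. If you ever need to produce such a key yourself, a minimal sketch with openssl (all file names here are throwaway examples created in a scratch directory, not your real certs):

```shell
set -e
dir=$(mktemp -d); cd "$dir"

# Stand-in for the original private key (throwaway, 2048-bit RSA)
openssl genrsa -out logstash.key 2048

# Unencrypted PKCS#8 conversion (-nocrypt); the output begins with
# "-----BEGIN PRIVATE KEY-----" rather than a key-type-specific header.
openssl pkcs8 -topk8 -nocrypt -in logstash.key -out logstash.pkcs8.key
head -1 logstash.pkcs8.key
```

Note that Filebeat itself is fine with a regular PEM key; the PKCS#8 requirement applies to the key Logstash loads in its beats input.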
@Badger I put /etc/filebeat/logstash-tutorial-dataset/* in filebeat.yml and it is still not working.
When I curl Logstash, I get the error below. Do you think that's the issue? (The certificate is self-signed, created with certutil.)
curl -k --cacert /etc/logstash/config/certs/ca.crt https://node1.ad-it.fr:5044
curl: (35) error:14094412:SSL routines:ssl3_read_bytes:sslv3 alert bad certificate
in the filebeat configuration. If that works, you have confirmed the client certificate is the problem. I would then start checking that you have the certificates in the right formats, that the private keys match, etc.
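The two checks suggested above can be sketched with openssl. The demo below generates a throwaway CA and leaf certificate in a scratch directory so it is self-contained; against the real setup you would run the same `openssl verify` and modulus commands on the ca.crt, logstash.crt, and logstash key from this thread:

```shell
set -e
dir=$(mktemp -d); cd "$dir"

# Throwaway CA plus a leaf certificate signed by it
openssl req -x509 -newkey rsa:2048 -nodes -keyout ca.key -out ca.crt \
  -subj "/CN=demo-ca" -days 1
openssl req -newkey rsa:2048 -nodes -keyout logstash.key -out logstash.csr \
  -subj "/CN=node1.ad-it.fr"
openssl x509 -req -in logstash.csr -CA ca.crt -CAkey ca.key -CAcreateserial \
  -out logstash.crt -days 1

# Check 1: does this CA actually verify the certificate?
openssl verify -CAfile ca.crt logstash.crt

# Check 2: do the certificate and the private key belong together?
# The two digests must be identical.
openssl x509 -noout -modulus -in logstash.crt | openssl md5
openssl rsa  -noout -modulus -in logstash.key  | openssl md5
```

If check 1 fails or the two digests in check 2 differ on your real files, that mismatch is the likely cause of the bad-certificate alerts.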
@Badger
My bad, I restarted my machine and then I was able to restart the services with those lines commented out. I still have the same problem (no logs in Kibana and a bad certificate with curl).
Thanks
@badger thanks
In the Filebeat logs it says the following. What should I understand from this?
2021-07-11T02:03:27.499+0200 INFO log/harvester.go:255 Harvester started for file: /etc/filebeat/logstash-tutorial-dataset/logstash-tutorial.log
2021-07-11T02:03:27.501+0200 INFO crawler/crawler.go:106 Loading and starting Inputs completed. Enabled inputs: 1
2021-07-11T02:03:27.501+0200 INFO cfgfile/reload.go:150 Config reloader started
2021-07-11T02:03:27.501+0200 INFO cfgfile/reload.go:205 Loading of config files completed.
2021-07-11T02:03:30.490+0200 INFO add_cloud_metadata/add_cloud_metadata.go:340 add_cloud_metadata: hosting provider type not detected.
2021-07-11T02:03:31.490+0200 INFO pipeline/output.go:95 Connecting to backoff(async(tcp://logstash.ad-it.fr:5044))
2021-07-11T02:03:33.237+0200 ERROR pipeline/output.go:100 Failed to connect to backoff(async(tcp://logstash.ad-it.fr:5044)): x509: certificate signed by unknown authority
2021-07-11T02:03:33.237+0200 INFO pipeline/output.go:93 Attempting to reconnect to backoff(async(tcp://logstash.ad-it.fr:5044)) with 1 reconnect attempt(s)
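The "x509: certificate signed by unknown authority" error in those logs means the CA file Filebeat is configured with does not verify the chain the server presents on port 5044. One way to test that directly is `openssl s_client` with the same CA file. The sketch below is self-contained: it stands up a throwaway `openssl s_server` with a freshly generated CA and server certificate (port 15044 is arbitrary); against the real setup you would point s_client at logstash.ad-it.fr:5044 with the ca.crt from filebeat.yml:

```shell
set -e
dir=$(mktemp -d); cd "$dir"

# Throwaway CA and server certificate (stand-ins for ca.crt / logstash.crt)
openssl req -x509 -newkey rsa:2048 -nodes -keyout ca.key -out ca.crt \
  -subj "/CN=demo-ca" -days 1
openssl req -newkey rsa:2048 -nodes -keyout srv.key -out srv.csr \
  -subj "/CN=logstash.ad-it.fr"
openssl x509 -req -in srv.csr -CA ca.crt -CAkey ca.key -CAcreateserial \
  -out srv.crt -days 1

# Stand-in for the Logstash beats listener
openssl s_server -accept 15044 -cert srv.crt -key srv.key -quiet &
server=$!; sleep 1

# The actual diagnostic: does the CA file verify the chain the server presents?
# "Verify return code: 0 (ok)" means yes; anything else reproduces the
# unknown-authority error Filebeat is logging.
openssl s_client -connect localhost:15044 -CAfile ca.crt </dev/null 2>/dev/null \
  | grep "Verify return code"

kill $server
```

If the verify code is non-zero against the real Logstash port, the certificate Logstash serves was not signed by the CA in filebeat.yml's ssl.certificate_authorities.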