Logstash to Elasticsearch connection error using certificate

Hi All,

I am facing this issue while connecting Elasticsearch with Logstash over a certificate.
I have set the logstash.yml path (LS_HOME and LS_SETTINGS_DIR) in the startup.options file. Still it gives me the following error:

```
Unable to retrieve license information from license server {:message=>"Elasticsearch Unreachable: [https://username:xxxxxx@xyz.uat.abc.com:9200/][Manticore::ClientProtocolException] PKIX path validation failed: java.security.cert.CertPathValidatorException: Path does not chain with any of the trust anchors"}
[2019-10-23T12:37:35,525][ERROR][logstash.monitoring.internalpipelinesource] Failed to fetch X-Pack information from Elasticsearch. This is likely due to failure to reach a live Elasticsearch cluster.
[2019-10-23T12:38:05,231][ERROR][logstash.licensechecker.licensereader] Unable to retrieve license information from license server {:message=>"No Available connections"}

[logstash.outputs.elasticsearch] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"https://username:xxxxx@xyz.uat.abc.com:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [https://username:xxxxxx@xyz.uat.abc.com:9200/][Manticore::ClientProtocolException] PKIX path validation failed: java.security.cert.CertPathValidatorException: Path does not chain with any of the trust anchors"}
```

Do I need a license for this? I am able to connect Elasticsearch and Kibana over SSL with the CA certificate, but I am facing issues between Logstash and Elasticsearch.

Below is my logstash.yml file:

```
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.username: username
xpack.monitoring.elasticsearch.password: password
xpack.monitoring.elasticsearch.hosts: [ 'https://xyz.uat.abc.com:9200' ]
xpack.monitoring.elasticsearch.ssl.certificate_authority: '/usr/share/logstash/config/certificates/INDUS1-CA.crt'
```

Where am I going wrong?

My logstash.conf in the pipeline folder looks like this:

```
output {

  stdout { codec => "json" }

  if [level] == 50 {
    elasticsearch {
      index => "%{name}-error-logs"
      cacert => '/usr/share/logstash/config/certificates/INDUS1-CA.crt'
      user => username
      password => password
      hosts => ["https://xyz.uat.abc.com:9200"]
    }
  }
}
```

Please don't post unformatted code, logs, or configuration as it's very hard to read.

Instead, paste the text and format it with the </> icon or pairs of triple backticks (```), and check the preview window to make sure it's properly formatted before posting. This makes it more likely that your question will receive a useful answer.

It would be great if you could update your post to solve this.

Hey @ikakavas!! Done. Thanks for letting me know :slight_smile:

Based on the error, this looks like an SSL certificate issue. Have you tried using openssl to verify that you can connect successfully to Elasticsearch with the certificate in question? It would be something like:

```
echo -n | openssl s_client -CAfile /usr/share/logstash/config/certificates/INDUS1-CA.crt -connect xyz.uat.abc.com:9200
```

The Logstash output plugin has no way of disabling hostname verification, so you need to make sure that:

  • The hostname that you use in the output plugin configuration (xyz.uat.abc.com) is included as a SAN in the certificate that Elasticsearch uses for TLS on the HTTP layer
  • The CA certificate that you reference with INDUS1-CA.crt is the actual CA certificate that signed the certificate that Elasticsearch uses for TLS on the HTTP layer; you can check this by running the command @Mike_Place shared above
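To see what the second bullet means in practice, here is a minimal offline sketch using throwaway keys. All names and paths below are placeholders standing in for the real INDUS1-CA material, not your actual certificates:

```shell
# 1. Make a throwaway CA (stand-in for INDUS1-CA)
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout /tmp/demo-ca.key -out /tmp/demo-ca.crt \
  -subj "/CN=demo-INDUS1-CA"

# 2. Make a server key + CSR for the wildcard name
openssl req -newkey rsa:2048 -nodes \
  -keyout /tmp/demo-server.key -out /tmp/demo-server.csr \
  -subj "/CN=*.uat.abc.com"

# 3. Sign the server cert with the CA, embedding the hostname as a SAN
printf "subjectAltName=DNS:xyz.uat.abc.com\n" > /tmp/demo-san.ext
openssl x509 -req -days 1 -in /tmp/demo-server.csr \
  -CA /tmp/demo-ca.crt -CAkey /tmp/demo-ca.key -CAcreateserial \
  -extfile /tmp/demo-san.ext -out /tmp/demo-server.crt

# 4. Confirm the SAN section lists the hostname Logstash connects to
openssl x509 -in /tmp/demo-server.crt -noout -text | grep -A1 "Subject Alternative Name"

# 5. This is the check that fails with "Path does not chain with any of
#    the trust anchors" when the CA file is not the one that signed the cert
openssl verify -CAfile /tmp/demo-ca.crt /tmp/demo-server.crt
```

Run steps 4 and 5 against your real server certificate and INDUS1-CA.crt to confirm both bullets hold.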

I am running the ELK instances using Docker. I tried running this command inside the container, but it gives me an error saying openssl: command not found.

Any other alternative?

@ikakavas, the certificates are the same in both the elasticsearch and logstash configs. I did not understand your first statement about there being no way of disabling hostname verification.

Tried configuring logstash.yml in different ways, but every time I get the same error. Do I need to change any configuration in Elasticsearch? I am a little skeptical, though, since as I said Elasticsearch and Kibana are working perfectly.

You can run the command outside the container, from anywhere that your Elasticsearch instance is reachable.

> The certificates are the same in both the elasticsearch and logstash configs.

Not sure what you mean by this; maybe you can share your elasticsearch configuration too.

> Tried configuring logstash.yml in different ways, but every time I get the same error.

I can understand the frustration, but this doesn't help us help you much, as we can't know what you tried, what didn't work, or why.

Hostname verification means that when the client connects to a server over SSL, it verifies that the hostname it connects to is actually included in the SSL certificate the server presents, in a section called Subject Alternative Names. Kibana can be configured not to check this, but the Logstash output plugin cannot; that is what my statement meant to convey.
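You can see hostname verification in isolation with a throwaway self-signed certificate; all hostnames below are placeholders, and `-addext` / `-verify_hostname` need a reasonably recent OpenSSL (1.1.1+):

```shell
# Self-signed cert whose SAN lists only xyz.uat.abc.com
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout /tmp/hv.key -out /tmp/hv.crt \
  -subj "/CN=*.uat.abc.com" \
  -addext "subjectAltName=DNS:xyz.uat.abc.com"

# Succeeds: the name we connect with is in the SAN list
openssl verify -CAfile /tmp/hv.crt -verify_hostname xyz.uat.abc.com /tmp/hv.crt

# Fails with "hostname mismatch": the name is not in the SAN list, which is
# the kind of failure a Java client surfaces as an SSL/PKIX error
openssl verify -CAfile /tmp/hv.crt -verify_hostname other.uat.example.com /tmp/hv.crt || true
```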

If you can share the configuration of elasticsearch and the configuration of kibana that works, then we can probably help you better understand what your issue is and how to overcome it.

> Any other alternative?

Copying the certificate file to a host that does have the openssl binary installed and can connect to the ES cluster in question would also work.

Thanks for your explanation @ikakavas. I looked at the certificate in Elasticsearch. It has the subject:

```
subject=/CN=*.uat.abc.com
```

@Mike_Place and @ikakavas
This makes me wonder: do we also need to configure SSL certificates in Logstash? Currently I am only using the CA cert.

Below is my elasticsearch.yml file:

```
cluster.name: "elk-cluster"
node.name: node-uat-01
discovery.type: single-node
network.host: 0.0.0.0
xpack.security.enabled: true
xpack.security.http.ssl.enabled: true
xpack.security.transport.ssl.enabled: true
xpack.security.http.ssl.verification_mode: certificate
xpack.security.http.ssl.key: certificates/uat_pem.key
xpack.security.http.ssl.certificate: certificates/uat_public.crt
xpack.security.http.ssl.certificate_authorities: certificates/INDUS1-CA.crt
xpack.security.transport.ssl.key: certificates/uat_pem.key
xpack.security.transport.ssl.certificate: certificates/uat_public.crt
xpack.security.transport.ssl.certificate_authorities: certificates/INDUS1-CA.crt
```

kibana.yml:

```
server.name: elk-kibana
server.host: "0"
#xpack.monitoring.ui.container.elasticsearch.enabled: true
server.ssl.enabled: true
server.ssl.certificate: /usr/share/kibana/config/certificates/uat_public.crt
server.ssl.key: /usr/share/kibana/config/certificates/uat_pem.key
elasticsearch.hosts: ["https://xyz.uat.abc.com:9200"]
elasticsearch.username: "username"
elasticsearch.password: "password"
#elasticsearch.ssl.verificationMode: none
elasticsearch.ssl.certificateAuthorities: /usr/share/kibana/config/certificates/INDUS1-CA.pem
```

@Mike_Place Yes, did that, and received an error saying:

```
CONNECTED(00000003)
depth=0 CN = *.uat.abc.com
verify error:num=20:unable to get local issuer certificate
verify return:1

....

No client certificate CA names sent
Peer signing digest: SHA512
Server Temp Key: ECDH, P-256, 256 bits
```

Is this certificate self-signed or is there a CA bundle you have installed as well?

@Mike_Place It is self-signed.

BTW, is xpack.security a paid feature? I am using version 7.2.0.

Does it connect if you set xpack.monitoring.elasticsearch.ssl.verification_mode: none in logstash.yml?

Doing this still gives me an error saying:

```
[logstash.outputs.elasticsearch] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"https://username:xxxx@xyz.uat.abc.com:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [https://username:xxxx@xyz.uat.abc.com:9200/][Manticore::ClientProtocolException] PKIX path validation failed: java.security.cert.CertPathValidatorException: Path does not chain with any of the trust anchors"}
```

with the following configuration in my logstash.conf:

```
output {

  stdout { codec => "json" }

  if [name] == "xyz-api" and [level] == 50 {
    elasticsearch {
      index => "xyz-api-error-logs"
      cacert => '/usr/share/logstash/config/certificates/INDUS1-CA.crt'
      user => username
      password => password
      hosts => ["https://xyz.uat.abc.com:9200"]
    }
  }
}
```

but when I add the following lines to the elasticsearch output:

```
ssl => true
certificate => '/usr/share/logstash/config/certificates/uat_public.crt'
key => '/usr/share/logstash/config/certificates/uat_pem.key'
```

it connects, but then throws an error and Logstash shuts down.

OK. What is the error? :slight_smile:

Error:

```
[ERROR][logstash.agent           ] Failed to execute action
{:action=>LogStash::PipelineAction::Create/pipeline_id:main,
:exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, => at line 31, column 17
(byte 890) after output {\n\nstdout { codec => \"json\" }\n     \n     
if [name] == \"xyz-api\" and [level] == 50 {\n  elasticsearch {\n   index => \"xyz-api-error-logs\"\n             ssl_certificate_verification => true\n   
cacert => '/usr/share/logstash/config/certificates/INDUS1-CA.crt'\n#            sniffing => false\n             user => username\n             password => password\n             hosts => [\"https://xyz.uat.abc.com:9200\"]\n    
ssl", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2577:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:151:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:24:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:325:in `block in converge_state'"]}
```

While shutting down, it shows that it could find the ES instance:

```
[INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://username:xxxxx@xyz.uat.abc.com:9200/]}}
[WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"https://username:xxxxx@xyz.uat.abc.com:9200/"}
[INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://xyz.uat.abc.com:9200"]}
[INFO ][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2, :thread=>"#<Thread:0x62c9f33e run>"}
[INFO ][logstash.javapipeline    ] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
[ERROR][logstash.inputs.metrics  ] Failed to create monitoring event {:message=>"For path: http_address. Map keys: [:stats, :os, :jvm]", :error=>"LogStash::Instrument::MetricStore::MetricNotFound"}
[INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[WARN ][org.logstash.execution.ShutdownWatcherExt] {"inflight_count"=>0, "stalling_threads_info"=>{"other"=>[{"thread_id"=>30, "name"=>"[.monitoring-logstash]>worker0", "current_call"=>"[...]/logstash-core/lib/logstash/java_pipeline.rb:239:in `block in start_workers'"}]}}
[ERROR][org.logstash.execution.ShutdownWatcherExt] The shutdown process appears to be stalled due to busy or blocked plugins. Check the logs for more information.
[INFO ][logstash.javapipeline    ] Pipeline terminated {"pipeline.id"=>".monitoring-logstash"}
[INFO ][logstash.runner          ] Logstash shut down.
```

You should not add this. These settings control the client key and certificate that Logstash would use to connect to Elasticsearch in order to perform client TLS authentication. This is not required, Elasticsearch is not set up for this, and it is irrelevant to the problems you are facing.
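For context, those output-plugin options would only matter if Elasticsearch were explicitly configured to request a certificate from clients, e.g. with something like the following setting (shown purely for illustration; this is not part of the fix for this thread, and your setup should not enable it):

```
# elasticsearch.yml -- hypothetical mutual-TLS setup, NOT your configuration:
# this is what would make ssl/certificate/key in the Logstash output meaningful
xpack.security.http.ssl.client_authentication: required
```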

Yes, this is the subject, but what about the SANs of the certificate? Is the hostname defined as one there too?

Also, does Kibana connect successfully to Elasticsearch with the exact configuration you have shared, or did you comment/uncomment anything? Looking at #elasticsearch.ssl.verificationMode: none, was that also commented out while Kibana connected just fine?

Correct, I understood the same. I am not adding those anywhere in my Logstash configuration.
Yes, Kibana and Elasticsearch work fine and are connected to each other with the configurations I mentioned above (kibana.yml and elasticsearch.yml).

My certificate looks like this. This is not the CA cert.

```
Bag Attributes
    localKeyID: 01 00 00 00
    friendlyName: UAT.abc.com
subject=/CN=*.uat.abc.com
issuer=/DC=com/DC=abc/CN=abc-INDUS1-CA
-----BEGIN CERTIFICATE-----
```

This is not enough unfortunately. Can you run

```
openssl x509 -in <path/to/your/cert/here> -text -noout
```

and share the output?