FSCrawler && SSL && SANs

This is not related. You can ignore this warning.

Are you still using the SNAPSHOT version?
I don't think this message should appear in the SNAPSHOT version.

Nope. This is from the 'standard' version.

Could you try the latest SNAPSHOT instead? Version 2.9 is actually quite old.

Hi, I have the same issue. I built from git yesterday. It seems like ssl_verification is ignored.

Can not execute GET https://elasticsearch:9200/ : No subject alternative DNS name matching crawler-elasticsearch found.

The hostname in my certificate does not match the URL, because I am running in a container and the name changes on each startup.

It seems this issue only occurs when the URL hostname and the certificate do not match. On a local installation, where the name matches but the certificate is not trusted, it works.
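
For what it is worth, here is a minimal standalone sketch (plain JDK, not FSCrawler code; the URL is just the one from my error message) that shows why: a trust-all TrustManager only skips certificate validation, while host name verification is a separate step that still fails when the certificate has no SAN for the URL host.

    import java.net.URL;
    import java.security.SecureRandom;
    import java.security.cert.X509Certificate;
    import javax.net.ssl.HttpsURLConnection;
    import javax.net.ssl.SSLContext;
    import javax.net.ssl.TrustManager;
    import javax.net.ssl.X509TrustManager;

    public class SslVerificationDemo {
        public static void main(String[] args) throws Exception {
            // Trust-all trust manager: certificate validation itself is skipped
            TrustManager[] trustAll = { new X509TrustManager() {
                public X509Certificate[] getAcceptedIssuers() { return new X509Certificate[0]; }
                public void checkClientTrusted(X509Certificate[] chain, String authType) { }
                public void checkServerTrusted(X509Certificate[] chain, String authType) { }
            }};
            SSLContext sslContext = SSLContext.getInstance("TLS");
            sslContext.init(null, trustAll, new SecureRandom());

            // Only the socket factory is replaced; the default host name verification
            // still runs. Against a server whose certificate has no SAN for this host,
            // the request fails with a name mismatch error even though the certificate
            // chain itself is accepted.
            HttpsURLConnection conn = (HttpsURLConnection)
                    new URL("https://elasticsearch:9200/").openConnection();
            conn.setSSLSocketFactory(sslContext.getSocketFactory());
            System.out.println(conn.getResponseCode());
        }
    }

So ssl_verification: false takes care of the trust part, but not of the name check.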

Hi, I think I fixed it by adding this to ElasticsearchClient (starting at line 168).

                // Install the configured SSL context as the default for HTTPS connections
                HttpsURLConnection.setDefaultSSLSocketFactory(sslContext.getSocketFactory());
                // Lenient verifier: warn if the URL host and the SSLSession peer host differ,
                // but always accept, so certificate name (SAN/CN) mismatches no longer fail
                HostnameVerifier hv = new HostnameVerifier() {
                    public boolean verify(String urlHostName, SSLSession session) {
                        if (!urlHostName.equalsIgnoreCase(session.getPeerHost())) {
                            System.out.println("Warning: URL host '" + urlHostName + "' is different to SSLSession host '" + session.getPeerHost() + "'.");
                        }
                        return true;
                    }
                };
                HttpsURLConnection.setDefaultHostnameVerifier(hv);

I don't know how to contribute, so I'm sending it this way. Please merge it into your code if you are fine with it. This now not only ignores whether a cert is valid, it also ignores a CN that does not match.
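
One more thought on the approach, just as a sketch and not something I have tested in FSCrawler: setDefaultSSLSocketFactory and setDefaultHostnameVerifier change the defaults for every HTTPS connection in the JVM. If ElasticsearchClient builds its client through the JAX-RS ClientBuilder (an assumption on my side), the same leniency could be scoped to that single client instead:

    import javax.net.ssl.SSLContext;
    import jakarta.ws.rs.client.Client;
    import jakarta.ws.rs.client.ClientBuilder;

    public class LenientClientSketch {
        // Sketch only: reuse the SSLContext built when ssl_verification is false,
        // and accept any host name, without touching the JVM-wide defaults.
        public static Client build(SSLContext sslContext) {
            return ClientBuilder.newBuilder()
                    .sslContext(sslContext)
                    // ignore SAN/CN mismatches for this client only
                    .hostnameVerifier((hostname, session) -> true)
                    .build();
        }
    }

(Whether the import is jakarta.ws.rs or javax.ws.rs depends on the JAX-RS version on the classpath.)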


Thanks a lot @macb74.

Could you tell me how to reproduce the problem, so I can make sure that your proposed fix actually solves it without side effects?

Based on your proposal, I opened:

Right now, your code just ignores an untrusted certificate when ssl_verification is set to false.

My additional code also ignores the case where the hostname in the certificate does not match the URL Elasticsearch is listening on.

For example, the SSL certificate has CN my-first-elasticsearch.mydomain.com and my URL for Elasticsearch is https://my-second-elasticsearch.mydomain.com. As you can see, the certificate installed on the Elasticsearch host does not match the host name.

In my environment this happened because I started Elasticsearch in a Docker container for the first time, and auto-configuration generated the SSL certificates. The host name has since changed, so the certificate no longer matches it. This is not a good idea for production, but it happens in my dev environment from time to time.
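
If it helps with reproducing this: a small sketch that prints the names in the certificate the server actually presents, so the mismatch with the host name in the FSCrawler URL can be seen directly (trust-all again, because the auto-generated certificate is not in the truststore; host and port are placeholders).

    import java.security.cert.X509Certificate;
    import javax.net.ssl.SSLContext;
    import javax.net.ssl.SSLSocket;
    import javax.net.ssl.TrustManager;
    import javax.net.ssl.X509TrustManager;

    public class PrintServerCertNames {
        public static void main(String[] args) throws Exception {
            // Trust everything so the auto-generated, self-signed chain can be inspected
            TrustManager[] trustAll = { new X509TrustManager() {
                public X509Certificate[] getAcceptedIssuers() { return new X509Certificate[0]; }
                public void checkClientTrusted(X509Certificate[] chain, String authType) { }
                public void checkServerTrusted(X509Certificate[] chain, String authType) { }
            }};
            SSLContext ctx = SSLContext.getInstance("TLS");
            ctx.init(null, trustAll, null);

            // Handshake with the Elasticsearch host and dump the certificate's names
            try (SSLSocket socket = (SSLSocket) ctx.getSocketFactory()
                    .createSocket("elasticsearch", 9200)) {
                socket.startHandshake();
                X509Certificate cert =
                        (X509Certificate) socket.getSession().getPeerCertificates()[0];
                System.out.println("Subject DN: " + cert.getSubjectX500Principal());
                // May print null if the certificate has no SAN extension at all
                System.out.println("SANs:       " + cert.getSubjectAlternativeNames());
            }
        }
    }

If none of the printed names matches the host in the FSCrawler URL, you are in exactly the situation described above.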

Hello,
Could you please test your code with the --restart switch?
