We are using ELK stack version 7.9.0. When we try to connect Logstash with Elasticsearch, it gives us this error:
'client requested protocol tlsv1 is not enabled or supported in server context'
When we specify the following parameter in the elasticsearch.yml file, it works fine.
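(Presumably this is the xpack.security.http.ssl.supported_protocols setting quoted further down in this thread; a minimal sketch of the elasticsearch.yml line would be:)

```yaml
# elasticsearch.yml -- presumed setting: explicitly allow older TLS versions on the HTTP layer
xpack.security.http.ssl.supported_protocols: [ "TLSv1.2", "TLSv1.1", "TLSv1" ]
```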
Given that this issue is still unresolved, I don't think there is a way to configure this in Logstash's elasticsearch output plugin at the moment, unfortunately.
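For context, a TLS-enabled elasticsearch output typically looks roughly like the sketch below (host, credentials and certificate path are placeholders); note that it has no option for choosing which TLS protocol version is negotiated, which is why the change has to happen on the JVM or Elasticsearch side:

```
output {
  elasticsearch {
    hosts    => ["https://es-node.example.com:9200"]  # placeholder host
    user     => "logstash_writer"                     # placeholder credentials
    password => "changeme"
    ssl      => true
    cacert   => "/etc/logstash/certs/ca.crt"          # placeholder CA path
    # no setting here to pin the TLS protocol version
  }
}
```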
You can configure your JVM to disallow TLSv1.0 and TLSv1.1, thus forcing Logstash to use TLSv1.2. Depending on how you have installed Logstash and which JVM you are using (see https://www.elastic.co/guide/en/logstash/current/getting-started-with-logstash.html#ls-jvm), you need to locate the java.security file within the JDK dir. In there, there is a jdk.tls.disabledAlgorithms= line to which you can add TLSv1, TLSv1.1.
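To illustrate, the edited line might look roughly like this (the java.security path and the rest of the algorithm list depend on your JDK version, so treat the other entries as placeholders):

```
# <jdk dir>/conf/security/java.security -- exact path depends on the JDK in use
# TLSv1 and TLSv1.1 appended to the existing comma-separated list
jdk.tls.disabledAlgorithms=SSLv3, TLSv1, TLSv1.1, RC4, DES, MD5withRSA, \
    DH keySize < 1024, EC keySize < 224, 3DES_EDE_CBC, anon, NULL
```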
Thanks Ioannis Kakavas for your immediate reply, much appreciated.
After adding [TLSv1, TLSv2] to java.security, we are getting this:
'No appropriate protocol, may be no appropriate cipher suite specified or protocols are deactivated"}'
Moreover, we have another instance of ELK 7.9.0 (on another machine) where we didn't need to specify
xpack.security.http.ssl.supported_protocols: [ "TLSv1.2", "TLSv1.1", "TLSv1" ]
and the connection from Logstash to Elasticsearch went through fine.
We still couldn't figure out why it is needed on one instance while the other one is doing fine without it.
Mind you, jdk.tls.disabledAlgorithms= controls the versions that are to be disabled at the JVM level. It's not the same as supported_protocols, which dictates the versions that should be enabled.
In that sense, adding both [TLSv1, TLSv2] to jdk.tls.disabledAlgorithms= makes no sense. You should add only TLSv1 (i.e. disable only TLSv1).
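Under the same assumptions as the earlier sketch, the line would then list only TLSv1 among the protocol versions, leaving TLSv1.2 (and TLSv1.1) usable:

```
jdk.tls.disabledAlgorithms=SSLv3, TLSv1, RC4, DES, MD5withRSA, \
    DH keySize < 1024, EC keySize < 224, 3DES_EDE_CBC, anon, NULL
```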