How to configure the Logstash http input to accept HTTPS POSTs from a third-party service

Hi
I have a third-party service that posts JSON data to our Logstash instance. It works over HTTP; now I want to harden the connection with HTTPS. The third-party service's SSL certificate is already registered on my Linux server (/etc/ssl/certs).

My understanding is that my Logstash instance acts like a browser does, so it should be able to handle the SSL connection once I point it at a valid system certificate store, which is located in /etc/ssl/certs.

My input section looks like this:

input {
    http {
        ssl => true
        ssl_verify_mode => "peer"
        ssl_certificate_authorities => ["/etc/ssl/certs"]
        port => 8443
    }
} 

When I run Logstash, the error "<LogStash::ConfigurationError: Certificate or JKS must be configured" shows up. I don't have a key and .crt file from the server that is sending the data, but I could obtain the .pem file. The certificate that server uses is already registered in /etc/ssl/certs, so I can trust it. When I browse to that page I don't have to manually accept the SSL certificate either, because it is in the system certificate store.

As described in the documentation, I should make use of a certificate and key: "You can enable encryption by setting ssl to true and configuring the ssl_certificate and ssl_key options."
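If I read that correctly, the http input needs its own server certificate and key, rather than a CA store. That would look roughly like this (the paths are placeholders, not files I actually have yet):

```
input {
    http {
        port => 8443
        ssl => true
        # Server-side certificate and private key presented to the sender
        ssl_certificate => "/etc/logstash/certs/logstash-http.crt"
        ssl_key => "/etc/logstash/certs/logstash-http.key"
    }
}
```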

This article didn't help either: https://www.elastic.co/de/blog/tls-elastic-stack-elasticsearch-kibana-logstash-filebeat

Is it necessary to have a globally trusted certificate to get this connection running?
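If a self-signed certificate were enough, I suppose the pair could be generated with something like this (the CN and filenames are hypothetical, and the sending service would have to be configured to trust the result):

```shell
# Generate a self-signed certificate and key for the Logstash http input.
# CN and filenames are placeholders; adjust for the real deployment.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout logstash-http.key -out logstash-http.crt \
  -days 365 -subj "/CN=loginput.company.com"
```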

Lukas

Solution was:
Put it behind an nginx proxy that handles the SSL termination. I had to register a subdomain, loginput.company.com, pointing to the fixed IP of the nginx proxy. The nginx proxy just forwards the traffic, and the input section becomes simple:

input {
  http {
      port => <dest-port>
  }
}
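The nginx side of that setup would have roughly this shape (a sketch, not the exact config used; the certificate paths are placeholders, and <dest-port> matches the placeholder in the input above):

```
# TLS termination in front of the plain-HTTP Logstash input
server {
    listen 443 ssl;
    server_name loginput.company.com;

    ssl_certificate     /etc/ssl/certs/loginput.company.com.crt;
    ssl_certificate_key /etc/ssl/private/loginput.company.com.key;

    location / {
        # Forward the decrypted traffic to Logstash
        proxy_pass http://127.0.0.1:<dest-port>;
    }
}
```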
