SSL from filebeat to Kafka

I came across the following note at this link: https://www.elastic.co/guide/en/beats/filebeat/current/securing-communication-elasticsearch.html

For any given connection, the SSL/TLS certificates must have a subject that matches the value specified for hosts, or the SSL handshake fails. For example, if you specify hosts: ["foobar:9200"], the certificate MUST include foobar in the subject (CN=foobar) or as a subject alternative name (SAN). Make sure the hostname resolves to the correct IP address. If no DNS is available, then you can associate the IP address with your hostname in /etc/hosts (on Unix) or C:\Windows\System32\drivers\etc\hosts (on Windows).

Does this hold good for Kafka as well? If so, could you please explain why it is needed?

Thanks

This note is about SSL/TLS certificates in general and is not specific to Filebeat or Elasticsearch; it is simply how certificate verification works. The client checks that the name it used to connect (the value in hosts) appears in the certificate's subject (CN) or subject alternative names (SAN). Without this check, anyone holding a valid certificate for any host could impersonate your server, so hostname verification is what protects the connection from man-in-the-middle attacks. So yes, it is true for Kafka as well.
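You can see what the client compares against by inspecting a certificate's subject and SAN with openssl. Here is a sketch that generates a throwaway self-signed certificate for a hypothetical broker name kafka-broker (substitute your real broker certificate and hostname) and then prints the fields that must match your hosts setting:

```shell
# Generate a throwaway self-signed cert whose subject (CN) and SAN both
# say "kafka-broker" (a hypothetical hostname used for illustration).
# Requires OpenSSL 1.1.1+ for -addext.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -keyout /tmp/kafka-key.pem -out /tmp/kafka-cert.pem \
  -subj "/CN=kafka-broker" \
  -addext "subjectAltName=DNS:kafka-broker" 2>/dev/null

# Print the subject and the SAN list. One of these must contain the
# exact name you put in the Kafka output's `hosts` setting, or the
# TLS handshake will fail hostname verification.
openssl x509 -in /tmp/kafka-cert.pem -noout -subject -ext subjectAltName
```

Running the same x509 command against your actual broker certificate will tell you immediately whether the name in your Filebeat config can possibly verify.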

So, if you want to secure the connection to your Kafka brokers, you have to configure your Kafka output to use SSL, just like you would for any other output.

For example:

output.kafka:
  # [ ...your fields ...]
  ssl.certificate_authorities:
    - /etc/pki/my_root_ca.pem
    - /etc/pki/my_other_ca.pem
  ssl.certificate: "/etc/pki/client.pem"
  ssl.key: "/etc/pki/key.pem"

Here is the documentation on configuring SSL for the Kafka output: https://www.elastic.co/guide/en/beats/filebeat/master/kafka-output.html#_literal_ssl_literal_3
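One related setting worth knowing about is ssl.verification_mode, which controls exactly the hostname check described in the note you quoted. A minimal sketch (the hostname kafka-broker and the CA path are placeholders, not values from your setup):

```yaml
output.kafka:
  hosts: ["kafka-broker:9093"]   # this name must appear in the broker cert's CN or SAN
  ssl.certificate_authorities: ["/etc/pki/my_root_ca.pem"]
  # "full" (the default) verifies the certificate chain AND that the
  # hostname matches the CN/SAN. Setting it to "none" disables
  # verification entirely - useful only for troubleshooting, never
  # for production, since it removes the man-in-the-middle protection.
  ssl.verification_mode: full
```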
