I did some research and found that Logstash offers mutual SSL authentication, so I can authenticate both the servers on which Filebeat runs and the Logstash server itself.
Our future centralized logging solution would require complete security isolation between the data coming from different projects.
In a nutshell, I would like Filebeat agents shipping data from a particular application to access my secured ES cluster in such a way that access is granted based on a client certificate of the Filebeat server. In other words, I am trying to avoid a scenario where a Filebeat agent from app1 starts loading data into Elasticsearch indices reserved for app2.
Since all the logs go through Logstash first, is it possible with X-Pack to restrict write access to specific indices based on the client certificate of the original source where Filebeat is collecting logs?
If I run a separate Logstash instance on each client server alongside the Filebeat shipper, would I be able to differentiate the output of those Logstash instances at a centralized ES instance using client certificates, so that Logstash from app1 will not have access to indices reserved for app2 and vice versa?
You mention both beats and logstash. Do you have beats connecting directly to ES, or is it going via LS?
From your most recent post, I assume it is the latter.
What you want to do should work fine, as long as you have an LS pipeline for each app.
You'd have Filebeat talking to LS over SSL with client certificates (ssl_verify_mode: force_peer), then pipe that output to Elasticsearch over SSL with a new client certificate. You can then assign different ES roles to each Logstash pipeline.
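As a rough sketch of what one such per-app pipeline could look like, here is a hypothetical Logstash config for app1. All hostnames, paths, and the index name are placeholders, not values from this thread:

```conf
# Hypothetical pipeline for app1 -- adjust paths, hosts, and index to your setup.
input {
  beats {
    port => 5044
    ssl => true
    # CA that signed the Filebeat client certificates
    ssl_certificate_authorities => ["/etc/logstash/ca.crt"]
    ssl_certificate => "/etc/logstash/logstash.crt"
    ssl_key => "/etc/logstash/logstash.key"
    # Require a valid client certificate from every connecting beat
    ssl_verify_mode => "force_peer"
  }
}
output {
  elasticsearch {
    hosts => ["https://es.example.org:9200"]
    index => "app1-%{+YYYY.MM.dd}"
    ssl => true
    cacert => "/etc/logstash/ca.crt"
    # Client certificate dedicated to the app1 pipeline, as a PKCS12/JKS keystore
    keystore => "/etc/logstash/app1-client.p12"
    keystore_password => "changeme"
  }
}
```

A second pipeline for app2 would be identical except for its own client keystore and its own index pattern.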
The ES PKI realm assigns a distinct user principal based on the Distinguished Name of the SSL certificate. You map those usernames/DNs to specific roles for each user.
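To illustrate the DN-to-role mapping, a hypothetical role_mapping.yml might look like this (the DNs and role names are examples only, not taken from this thread):

```yaml
# role_mapping.yml -- maps certificate DNs to ES roles.
# Each role would be defined to grant write access only to that app's indices,
# e.g. logstash_app1_writer on "app1-*" and logstash_app2_writer on "app2-*".
logstash_app1_writer:
  - "CN=logstash-app1, OU=Logging, O=Example Corp"
logstash_app2_writer:
  - "CN=logstash-app2, OU=Logging, O=Example Corp"
```

With that in place, the certificate presented by each Logstash pipeline determines which indices it can write to.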
Do you have beats connecting directly to ES, or is it going via LS?
From your most recent post, I assume it is the latter.
Yes, it's the latter: beats are connecting to LS first and then ES.
It seems that the only way to go would be having a separate Logstash instance for each app and setting up a PKI realm and client certificates.
One last question:
Can I have multiple different realms at the same time in the elasticsearch.yml configuration: an AD realm to authenticate users and a PKI realm to accommodate the advanced security scenario with client certificates for the app servers?
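Multiple realms can coexist; each realm gets an `order` and is tried in sequence. A minimal sketch in elasticsearch.yml, using the X-Pack 5.x/6.x settings layout and placeholder names/domains:

```yaml
# elasticsearch.yml -- realm names and domain are examples, not real values.
xpack:
  security:
    authc:
      realms:
        active_directory1:
          type: active_directory
          order: 0
          domain_name: example.com
        pki1:
          type: pki
          order: 1
```

Here interactive users would authenticate against AD, while connections presenting a trusted client certificate fall through to the PKI realm.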
@TimV, I am having trouble using/finding any client certificate options for the Logstash Elasticsearch output plugin.
Does the plugin support client certificates, or do I need to use the http output plugin, which has client_cert & client_key?
When I try to use client_cert & client_key with the elasticsearch output plugin, I get:
[2017-09-11T14:07:49,764][ERROR][logstash.outputs.elasticsearch] Unknown setting 'client_cert' for elasticsearch
[2017-09-11T14:07:49,769][ERROR][logstash.outputs.elasticsearch] Unknown setting 'client_key' for elasticsearch
[2017-09-11T14:07:49,783][ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Something is wrong with your configuration."}
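Those errors are expected: client_cert and client_key belong to the http output plugin. The elasticsearch output plugin instead takes the client certificate and key bundled in a Java keystore (JKS or PKCS12) via its keystore option. A sketch, assuming a hypothetical keystore path and password:

```conf
output {
  elasticsearch {
    hosts => ["https://es.example.org:9200"]
    ssl => true
    cacert => "/etc/logstash/ca.crt"
    # Client cert + key packaged as a keystore, e.g. built with:
    #   openssl pkcs12 -export -in client.crt -inkey client.key \
    #     -out logstash-client.p12
    keystore => "/etc/logstash/logstash-client.p12"
    keystore_password => "changeme"
  }
}
```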