Separate Filebeat data for different groups of servers


I am very new to ELK, so please help me clarify something here. I have installed an instance of ELK, all on one server, for testing. I can ship logs to Logstash from my servers using Filebeat. I have also enabled the trial license to test the security features. Now the question is:
There are 3 different types of servers whose logs should be accessible/discoverable only by their respective groups of users. How can I set this up so that, for example, the database people cannot see application logs, but only database logs?

Any ideas are highly appreciated!

So from your message, what I understood is:
=> you have Beats collecting data on endpoints/devices/hosts
=> then sending it directly to Elasticsearch? (I hope you are not using Logstash in between?)
=> you need to allow roles/permissions only for certain indices?

You need to design your setup properly. Some tips:

  • Ensure the Beats send their data to a specific index (look into how you can route events to individual indices).
  • Design your index naming convention in a systematic manner (e.g. my_os_windows_yyyymmdd, my_db_oracle_yyyymmdd).
  • Create an index pattern such as my_db* and grant roles/permissions on it to those users.
  • You could also create Spaces in Kibana to isolate users/activities.
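To make the third tip concrete, here is a rough sketch of granting a group read access to only the database indices, using the Elasticsearch security role API (the role name `db_readers` and the hosts are just examples; you would still map this role to your users or to an LDAP/AD group, and optionally pair it with a Kibana role or Space):

```
PUT /_security/role/db_readers
{
  "indices": [
    {
      "names": ["my_db*"],
      "privileges": ["read", "view_index_metadata"]
    }
  ]
}
```

Users holding only this role would then be unable to query or discover the application or OS indices.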

Hi Kelk,
Thank you for your comment again in this topic.
You understood my message correctly. And yes, there is Logstash in between. Is that bad? I guess it plays its role in ELK, right?
Could you please point me to a knowledge page for your first tip, "Ensure the beats send the data specific to the index"? Do you mean that the Beat can ship logs to different index names? My understanding was that Filebeat controls the index name and we cannot interfere, but apparently that is not correct. All my indices end up with names like filebeat_xx_yy_zz.

You can have Logstash in between. In fact, it gives you more control over your data.

And of course you can change the index names coming from Filebeat, especially if you have Logstash:

  • Just tag the dataset in Filebeat (under inputs.d) or add a custom field (e.g. my_db_oracle).
  • In Logstash, apply logic based on that tag or field and route the event to the relevant index (i.e. choose the index conditionally in the output).
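The two steps above could look roughly like this (paths, hosts, and names are placeholders for your own setup). First, tag the input in a Filebeat config:

```yaml
# e.g. a file under inputs.d/ - the path below is only an example
- type: log
  paths:
    - /var/log/oracle/*.log
  tags: ["my_db_oracle"]
```

Then, in the Logstash pipeline, branch on the tag in the output section so tagged events land in their own index:

```
output {
  if "my_db_oracle" in [tags] {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "my_db_oracle_%{+yyyyMMdd}"
    }
  } else {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "my_default_%{+yyyyMMdd}"
    }
  }
}
```

With one tag (or field) per server group, each group's logs end up in a predictably named index, which is what makes the my_db* role-based access pattern work.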

Thanks for your help. Exactly what I needed!