I have some basic queries related to the ELK stack:
- How much CPU and memory do the Beats agents (Filebeat, Winlogbeat, Metricbeat) use when shipping logs from client machines? Can this be controlled in any manner?
- On average, how much disk space would be required to store log data with a retention period of 30 days for 400 servers (300 Windows + 100 Linux)? I just need a rough idea.
There is an agreement with my client to keep a backup of the system logs.
If there is any documentation that addresses these queries, please let me know.
Thank You
I would say this depends a lot on how much work the agents have to do, so your best bet is to test it.
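That said, Beats do expose a few knobs that bound their footprint. Here is a minimal filebeat.yml sketch, assuming otherwise-default settings; the values and the log path are placeholders for illustration, not recommendations:

```yaml
# Illustrative filebeat.yml fragment -- numbers and paths are placeholders.
max_procs: 1              # cap the number of CPU cores Filebeat may use

queue.mem:
  events: 2048            # bound the in-memory event queue

filebeat.inputs:
  - type: log
    paths:
      - /var/log/*.log    # placeholder path
    harvester_limit: 10   # limit concurrently open files for this input
```

Lowering these trades throughput for a smaller, more predictable footprint, which is why testing against your real log volume is still the right call.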
That will depend on how much data is generated on these servers, as well as on how far you have optimised your index settings and mappings. As this tends to vary a lot from use case to use case, your best bet is probably to test here as well.
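Just to illustrate the kind of back-of-envelope calculation involved (the per-server volumes here are made-up assumptions, not typical values): if each Windows server shipped ~50 MB/day and each Linux server ~100 MB/day, that would be 300 × 50 MB + 100 × 100 MB = 25 GB/day, or ~750 GB over a 30-day retention window before replication; with one replica that roughly doubles to ~1.5 TB. Measuring a few representative servers for a week gives you real numbers to plug into the same arithmetic.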
Ok, every system will have its own usage. That's fine.
Also, can you let me know what risks are involved in using ELK and Beats? We install the Beat agents on client machines, and data is then sent by the respective Beats. Is there any risk after installing these agents, or while the data is being sent in JSON format?
If these Beats were installed under non-admin users, would the risk be lowered?
That is unfortunately something I do not think I will be able to help with, but maybe someone else has some input.
Is there any link or document that can help?
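On the transport side, the usual mitigations are running the Beats under a non-privileged account where the data source allows it and encrypting the output connection. A sketch of what TLS towards Elasticsearch can look like in filebeat.yml (the hostname, CA path, and credentials are placeholders); the securing-communication section of the Beats reference documentation covers this in detail:

```yaml
# Illustrative filebeat.yml output section -- host, path, and user are placeholders.
output.elasticsearch:
  hosts: ["https://elasticsearch.example.com:9200"]
  ssl.certificate_authorities: ["/etc/filebeat/certs/ca.pem"]  # verify the server cert
  username: "filebeat_writer"   # a role limited to writing Filebeat indices
  password: "${FILEBEAT_PW}"    # pulled from the environment or the Beats keystore
```

Note that some inputs need enough privilege to read their source (Winlogbeat reading the Windows event log, for example), so a fully non-admin install is not always possible.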