I'm hoping that someone can provide some advice or experience. So far I have a proof of concept working, analysing a single web server.
In production we have multiple servers (web, application, DBs) as well as multiple OSes (Linux, CentOS, Windows) that I want to analyse. At the moment it is only my team looking at the data as a debug tool, but at a later stage I might also want to restrict/grant access to certain dashboards in Kibana for other departments. What would be the best way of setting up an ELK stack for this? So far my ideas are:
- One index for everything, and try to get Logstash to map fields to the same names where possible
- An index per server (not quite sure how to configure this)
- Some other method I've not thought of.
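For the index-per-server idea, I found that the Logstash elasticsearch output can interpolate event fields into the index name. A rough sketch of what I mean (assuming the input, e.g. Filebeat, sets a `host` field, and that field names match your setup):

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # One index per host per day; "host" is assumed to be set
    # by the shipper -- adjust to whatever your events carry.
    index => "logs-%{host}-%{+YYYY.MM.dd}"
  }
}
```

The date suffix seems to be the common pattern anyway, since it makes retention/rollover easier.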
Occasionally, I'd like to compare certain stats between servers (to trace the actions of a single IP through the system, or to compare loads between web frontends, app servers and DBs), but I'm not sure whether this is possible if I use separate indexes?
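From what I've read, cross-index queries should still work, since Elasticsearch accepts wildcards and comma-separated index lists at search time (and Kibana index patterns can use wildcards too). Something like this, assuming per-server indices named `logs-*` and a shared `clientip` field (both names are just placeholders for my setup):

```
GET /logs-*/_search
{
  "query": {
    "term": { "clientip": "203.0.113.5" }
  }
}
```

But I don't know if there are downsides to relying on wildcard searches across many indices, which is partly why I'm asking.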
I'm after the pros/cons of each approach, and thoughts on whether I'm on the right track or completely missing the point of how to set this up at scale.
Thanks for your input.