Dear All,
I have 10 servers with different applications and different clients, and I want to set up one central ELK Stack server for all of them. The servers will be sending the following logs to my ELK box:
Apache access & error logs
SSH access and failed-login logs
Mail logs (Postfix / Exim)
MySQL logs / slow query log
System logs (CPU / RAM / disk / load average)
Firewall logs
IDS logs
ClamAV logs
Maldet logs
AIDE logs
Now my questions are:
How do I differentiate the logs by host?
For a single server, if I put the filters in, say, a server1.conf file and parse them there, will it work?
How do I see the data on a Kibana dashboard for each server? Is it a matter of creating different dashboards and visualizations?
Please help.
It depends on what you are using to ship the logs. If you are using Filebeat, there is a field that carries the hostname by default.
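For example, recent Filebeat versions set `host.name` on every event (older releases used `beat.hostname` instead), so in Logstash you can branch on it. A minimal sketch, assuming Filebeat 6.3+ and the hypothetical hostname srv1.example.com:

```
filter {
  # host.name is added by Filebeat automatically (6.3+;
  # older releases used beat.hostname instead)
  if [host][name] == "srv1.example.com" {
    mutate { add_tag => ["srv1"] }
  }
}
```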
For a single server, if I put the filters in, say, a server1.conf file and parse them there, will it work?
I don't understand the question.
How do I see the data on a Kibana dashboard for each server? Is it a matter of creating different dashboards and visualizations?
Filebeat comes with some default dashboards, but in any case, if the server name is part of the document indexed in Elasticsearch, then it's easy to add a filter on whichever server you want in Kibana.
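For instance, assuming Filebeat's default `host.name` field is present on the documents, a single query-bar filter in Kibana scopes any dashboard or visualization to one server:

```
host.name : "srv1.example.com"
```

The same dashboard can then be reused for every server just by changing the filter, rather than building a separate dashboard per host.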
For a single server, if I put the filters in, say, a server1.conf file and parse them there, will it work?
So, for example, I have srv1.example.com and I want to parse the following logs from that server:
Apache logs
Firewall logs
SSH auth logs
System logs (CPU / memory, etc.)
I create a file named srv1.example.conf and put the filters for all of the above in that one file.
Will it work?
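A sketch of what such a srv1.example.conf might look like, assuming Filebeat's default `host.name` and `log.file.path` fields (the path checks and the choice of grok patterns here are illustrative, not a definitive layout):

```
# srv1.example.conf -- filters that apply only to srv1.example.com
filter {
  if [host][name] == "srv1.example.com" {
    # Route each log type to its own parser based on the source file path
    if [log][file][path] =~ "apache" {
      grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }
    }
    else if [log][file][path] =~ "auth" {
      grok { match => { "message" => "%{SYSLOGLINE}" } }
    }
  }
}
```

Note that Logstash concatenates all .conf files in the pipeline directory into one pipeline, so the outer hostname conditional is what keeps these filters from touching events shipped by other servers.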