Basically I have an entire product using log4net, and I want to start collecting all these logs in a centralized Elasticsearch cluster. I have done a lot of research over the past two weeks and would like your recommendations too. I have come up with a few approaches that seem feasible:
a. One approach is to install Logstash on all servers and modify each service to add another log4net appender that sends logs to Logstash as well; Logstash on each individual machine then ships this data to Elasticsearch.
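For approach (a), a minimal sketch of what the extra appender might look like, using log4net's built-in UdpAppender pointed at the local Logstash instance (the port number and address here are assumptions, and a matching udp input would be needed on the Logstash side):

```xml
<!-- Hypothetical log4net config fragment: send events to a local Logstash
     over UDP. Port 5960 is an assumption; Logstash would need a
     corresponding udp { port => 5960 } input. -->
<log4net>
  <appender name="LogstashAppender" type="log4net.Appender.UdpAppender">
    <remoteAddress value="127.0.0.1" />
    <remotePort value="5960" />
    <layout type="log4net.Layout.PatternLayout">
      <conversionPattern value="%date %level %logger - %message%newline" />
    </layout>
  </appender>
  <root>
    <!-- Keep existing appenders and add this one alongside them. -->
    <appender-ref ref="LogstashAppender" />
  </root>
</log4net>
```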
b. The other approach is to have a centralized Logstash and make the log4net appender of each service send logs to that central server, which then connects to Elasticsearch. Approach (a) seems better than (b), since in (b) all the processing overhead lands on the one central server, which could become a bottleneck.
c. The third approach is to have Filebeat running on each server, which gathers the logs and sends them to a centralized Logstash, which then connects to Elasticsearch. The advantage of this approach is that we would not need to modify any service/application.
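For approach (c), a minimal sketch of the Filebeat side, assuming the services already write log files to disk (the paths, hostname, and multiline pattern below are assumptions and would need to match your actual log layout):

```yaml
# Hypothetical filebeat.yml fragment: tail the product's log files and ship
# them to a central Logstash. Paths and host are placeholders.
filebeat.inputs:
  - type: log
    paths:
      - /var/myproduct/logs/*.log
    # Join stack traces onto the log line they belong to, assuming each
    # entry starts with an ISO-style date.
    multiline.pattern: '^\d{4}-\d{2}-\d{2}'
    multiline.negate: true
    multiline.match: after

output.logstash:
  hosts: ["logstash.internal:5044"]
```

Filebeat is a lightweight shipper, so this keeps the per-server footprint much smaller than running a full Logstash instance on every machine.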
Lastly, I would also like to know whether you would advise running Elastic Cloud or running Elasticsearch on one of our VMs.
I have created a new topic for this mainly because our main requirements here are listed by priority:
- LEAST amount of overhead on each server.
- Making use of log4net so that we don't have to create a filter for each service, since every service generates logs differently (basically something like type => "log4net" in Logstash).
- Lower cost.
- Providing real-time analysis on elasticsearch
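To illustrate the second requirement, a minimal sketch of a Logstash pipeline that tags incoming events as "log4net" and parses them with a single shared filter (the grok pattern is an assumption and would have to match the actual log4net conversionPattern in use; the port and Elasticsearch host are placeholders):

```conf
# Hypothetical Logstash pipeline: one shared "log4net" filter for all services.
input {
  beats {
    port => 5044
    type => "log4net"
  }
}

filter {
  if [type] == "log4net" {
    # Assumes a PatternLayout like "%date %level %logger - %message".
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{DATA:logger} - %{GREEDYDATA:msg}" }
    }
    date {
      match => [ "timestamp", "ISO8601" ]
    }
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch.internal:9200"]
  }
}
```

Note that this only avoids per-service filters if all services share the same log4net layout; otherwise some per-format grok patterns would still be needed.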
Sorry for the length of the question. Any advice would be really helpful. Thank you!