Greetings all. I'm starting down the path of implementing ELK + NXLog for various reasons. Our current use case is monitoring:
- Certain Windows event log events (really just the ASP.NET-related ones, but I haven't figured out how to filter that specifically yet).
- IIS logs
- Various web application log files (many different apps, in many different environments, etc.).
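For the Windows event log piece, here's roughly the kind of NXLog input I've been sketching, using `im_msvistalog` with an event-log XPath query to grab only ASP.NET events. The provider name (`ASP.NET 4.0.30319.0`) is an assumption on my part and will depend on the installed .NET version, so treat this as a starting point, not a working config:

```
<Input aspnet_eventlog>
    Module  im_msvistalog
    # Only pull Application-log events whose provider looks like ASP.NET.
    # Provider name varies by .NET version -- check Event Viewer for yours.
    Query   <QueryList>\
                <Query Id="0">\
                    <Select Path="Application">\
                        *[System[Provider[@Name='ASP.NET 4.0.30319.0']]]\
                    </Select>\
                </Query>\
            </QueryList>
</Input>
```

If that's the wrong way to narrow things down, I'd love to hear it.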
I'm struggling to find examples of the best approach here...
My thought was to have each server ship its logs via NXLog to our central Logstash server. I'm OK with that except for one question: is it better to run the Logstash server with a single input on one port that ALL of these logs get shipped to, or to run multiple inputs on different ports and separate each type of log later via a type field, etc.?
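To make the two options concrete, this is the sort of Logstash input block I'm weighing (ports and type names here are placeholders I made up, and I'm assuming NXLog ships JSON):

```
input {
  # Option A: one port for everything; the shipper (NXLog) sets a
  # distinguishing field and filters sort it out later.
  tcp { port => 3515  codec => json }

  # Option B: one port per log source, tagging the type at the door.
  tcp { port => 3516  codec => json  type => "eventlog" }
  tcp { port => 3517  codec => json  type => "iis" }
  tcp { port => 3518  codec => json  type => "applog" }
}
```

Is there a practical reason (performance, firewall rules, simpler filters) to prefer one over the other?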
Then on the output side, am I better off putting ALL of these logs into one Elasticsearch index, or splitting them up, one index per type? Should ALL application logs share a single index, or be separated into different indexes per app? Some of these apps form more of a service-oriented architecture, so a transaction can bounce between multiple apps; does that change your answer when it comes to finding, for instance, a chain of logs that share a "transactionId"?
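For the split-by-type option, I assume the output would look something like the sketch below, interpolating the type into the index name (exact option names like `hosts` vs. `host` seem to vary between Logstash versions, so adjust for yours):

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # One daily-rotated index family per log type,
    # e.g. iis-2015.06.01, eventlog-2015.06.01, applog-2015.06.01
    index => "%{type}-%{+YYYY.MM.dd}"
  }
}
```

My understanding is that a wildcard like `iis-*,applog-*` (or just `*-2015*`) could then query across index families when chasing a transactionId, but I'd like confirmation that this is sane.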
Depending on the answer to that, what would the Kibana setups look like to allow for cross index queries?
I'll take pointers in Google Fu if I'm just missing where to find some of these answers!