I am a junior system administrator working mainly with Windows, and I'm still getting used to Linux.
I'm trying to set up the Elastic Stack to visualize logs from Windows services that are part of our production environment.
At the moment these logs are only looked at manually, after a service has already had a problem. We need to change that, so that we know about a problem before coworkers in production are affected by it.
I'm not great at coding, and grok filtering seems pretty cryptic to me.
How do I get logs from Windows services parsed so that the log message is separated into logical fields that I can use for visualizing?
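To make it concrete, here is the kind of thing I mean. Assuming a (hypothetical) log line like `2024-05-02T10:15:30 ERROR Connection to database lost`, this is roughly the grok filter sketch I pieced together from the docs (the field names are just placeholders I made up):

```conf
filter {
  grok {
    # split the raw message into a timestamp, a log level, and the rest
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:log_message}" }
  }
  date {
    # use the parsed timestamp as the event time instead of the ingest time
    match => ["timestamp", "ISO8601"]
  }
}
```

Is something along these lines the right approach, or am I misunderstanding how grok is supposed to be used?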
As far as I understand it, Filebeat only ships the data without parsing it or transforming it into the right format (that is Logstash's job). Winlogbeat can parse, but only Windows Event Logs; there seems to be no way to use it for custom text logs on disk.
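For reference, this is roughly how I currently have Filebeat shipping the raw files to Logstash (the paths and hostname are placeholders for our real setup):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - 'C:\ProgramData\MyService\logs\*.log'   # placeholder path
    fields:
      service: myservice                         # placeholder tag for filtering later

output.logstash:
  hosts: ["logstash-host:5044"]                  # placeholder hostname
```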
Does that mean I have to get our applications to log to the Windows Event Log, so that I can use Winlogbeat to ship and parse those logs, which are then sent to Elasticsearch?
It would be great if any of you could help me out or point me in the right direction, so I can get our Windows service logs into a readable format (ideally containing only the relevant information).
I already managed to get an older version running, but it just received logs via Filebeat in the default format, with the whole log line in a single message field, and I couldn't use that for visualizing properly.