How can I parse Windows Logs?

Hello,

I am a junior system administrator who works mainly with Windows and is getting used to Linux.
I'm trying to set up the Elastic Stack to visualize logs from Windows services that are part of our production environment.
At the moment these logs are only looked at manually once there is already a problem with a service. We need to change that, so that we know about a problem before our coworkers in production are affected by it.
I'm not great at coding, and grok filtering seems pretty cryptic to me.
How do I get logs from Windows services parsed, so that the log message is separated into logical fields I can use for visualizations?
As far as I understand it, Filebeat only ships the data without parsing it or getting it into the right format (which is Logstash's task). Winlogbeat can do that, but only for Windows event logs (there is no way to use it for custom text logs on the hard drive).
Does that mean I have to get our applications to log to the Windows event log, so that I can use Winlogbeat to ship and parse those logs before they are sent to Elasticsearch?

It would be great if any of you could help me out or point me in the right direction to get my Windows service logs into a readable format (ideally containing only the relevant information).

I already managed to get an older version of the stack running, but it only received logs via Filebeat in the default format, with a single message field containing the whole log line, and I couldn't use that for proper visualizations.

Hi,

Have a look at this particular beat: Winlogbeat Overview | Winlogbeat Reference [6.2] | Elastic

Do these application logs not already fall under the Windows Application event log? Is that something that just needs to be turned on? If not, then your assumption above is correct: point the application logs at the event log.

For further enrichment or parsing, point Winlogbeat at Logstash to make use of its filters, and have Logstash output to Elasticsearch.

Otherwise, if you are content with Filebeat, have it send to Logstash and output from there to Elasticsearch. A rough sketch of that wiring is below.
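Here is a minimal sketch of the Beats side of that setup; the host name and port are placeholders, not values from this thread. The same output.logstash section works in both winlogbeat.yml and filebeat.yml:

    # winlogbeat.yml or filebeat.yml: send events to Logstash
    # instead of directly to Elasticsearch
    output.logstash:
      hosts: ["logstash.example.local:5044"]   # placeholder host and port

The matching beats input on the Logstash side is shown in the Logstash sketch further down the thread.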

Do the Windows Services specify their own logging format and use a library to write to log files?

If they do, and the format is regular and parseable, you could use Filebeat in combination with an Elasticsearch ingest pipeline that uses a grok processor: Filebeat ships the raw log lines, and the pipeline parses them into structured documents as they are indexed into Elasticsearch. A sketch of such a pipeline follows.
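To make that concrete, here is a sketch. The pipeline name and the grok pattern are invented for illustration, since the real log format hasn't been shown; the pattern assumes a line shaped like "2018-05-04 10:15:00 INFO New data input: ...":

    PUT _ingest/pipeline/windows-service-logs
    {
      "description": "Parse service log lines into structured fields",
      "processors": [
        {
          "grok": {
            "field": "message",
            "patterns": [
              "%{TIMESTAMP_ISO8601:log_timestamp} %{LOGLEVEL:log_level} %{GREEDYDATA:log_message}"
            ]
          }
        }
      ]
    }

Filebeat then names that pipeline when indexing, so every event is parsed on arrival:

    # filebeat.yml: route events through the ingest pipeline on index
    output.elasticsearch:
      hosts: ["localhost:9200"]
      pipeline: windows-service-logs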


Yes, they do.
Shouldn't I be using Logstash for parsing?
I'll try to get it working with Winlogbeat first. If that doesn't work as hoped, I'll have to do it with grok, even though I'm having a hard time getting into log parsing and writing grok filters; I'm not familiar with the syntax.

That depends. Logstash has a much richer feature set than the ingest node in Elasticsearch. In this case, however, it sounds like an ingest pipeline with a grok processor would be sufficient to solve the problem, and would overall be a simpler architecture to manage. If you find a need in the future to perform more complex enrichment and filtering, or to receive data from multiple sources and send it to multiple destinations, Logstash could be introduced for that purpose. At that point, you might decide to move the grok processing out of the ingest pipeline into Logstash; a sketch of what that looks like is below.
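For comparison, a minimal Logstash pipeline doing the same grok work could look like this; the port, hosts, and pattern are the same illustrative placeholders as in the ingest pipeline sketch above, not your actual format:

    # logstash.conf -- sketch only; the grok pattern must match your real log format
    input {
      beats {
        port => 5044
      }
    }
    filter {
      grok {
        match => {
          "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{LOGLEVEL:log_level} %{GREEDYDATA:log_message}"
        }
      }
    }
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
      }
    }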

Winlogbeat is great for reading Windows event logs. An application may not write all of its logs there, however; it may write only Service Control Manager interactions and fatal error events. If the Windows services are written by your company, I'd recommend speaking with the authors to understand the log destinations and formats, and using that information to guide your log collection strategy.

Hello.
I recently did a fresh install of the current Elastic Stack.
I used Winlogbeat to collect some event logs from a service running on this Windows host.
The logs arriving in Elasticsearch contain a message field that holds just the single log line, as it should. For example:

New data input: XJDF_20170504_randomtext_2.zip

When I look in Kibana, it says that the message field is searchable but not aggregatable.
I cannot select this field in Kibana to use it in visualizations.

I am not sure how to proceed.
Do I still need to parse the log?

For a string input to be aggregatable, it's best to map it as the keyword data type. You can map a string field as both text and keyword, allowing full-text search and aggregations respectively; an example mapping follows.
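A sketch of such a multi-field mapping; the index name is a placeholder, and note that 6.x still requires the mapping type level ("doc" below), which was dropped in 7.x:

    PUT my-logs
    {
      "mappings": {
        "doc": {
          "properties": {
            "message": {
              "type": "text",
              "fields": {
                "keyword": {
                  "type": "keyword",
                  "ignore_above": 1024
                }
              }
            }
          }
        }
      }
    }

With that in place, aggregations run against message.keyword while full-text queries use message.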

I would question what you actually want to aggregate on in this string, however; what are the important elements?

  1. The entire string verbatim?
  2. The file name?
  3. The file extension?

If it's 2 or 3, or some other substring, you may want to send the logs through an ingest pipeline that extracts those values into separate keyword fields, along the lines of the sketch below.
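For example, assuming the message always follows the "New data input: <file>" shape shown above, a grok processor can pull out the file name and extension; the pipeline and field names here are only suggestions:

    PUT _ingest/pipeline/data-input-files
    {
      "description": "Extract file name and extension from data-input messages",
      "processors": [
        {
          "grok": {
            "field": "message",
            "patterns": [
              "New data input: %{DATA:file_name}\\.%{WORD:file_extension}$"
            ]
          }
        }
      ]
    }

With dynamic mapping, file_name and file_extension will get keyword subfields you can aggregate on; mapping them explicitly as keyword is cleaner still.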
