Import Saved Windows Event Logs

Hello,

I have .evtx logs saved to CD/DVDs that I would like Elasticsearch to ingest. Is there a way to change the path that Winlogbeat uses to check for logs? I've tried converting them to .csv files and using Filebeat to send them to Elasticsearch, which works (sort of), but not all of the fields are parsed. I'd rather not have to write a template for this. Help? Thanks in advance.

It's not possible to use Winlogbeat for this, though it would be a great feature.

Thanks for the info. Are you aware of any documentation related to accomplishing this via Filebeat?

No, there's nothing in Filebeat for this either.

If you wanted to do some Go development, I can potentially see a path that reuses the Winlogbeat code. You could export the records from the .evtx file to XML using the tools in Windows. Then write a custom processor, similar to the decode_json_fields processor (source), that uses the Winlogbeat code to parse the XML. Finally, read the XML log lines using Filebeat and enable your custom event log XML processor. The config might look something like:

filebeat.prospectors:
  - paths: ['eventlog.xml']
processors:
  - decode_eventlog_xml: {}
output.elasticsearch.hosts: ["http://localhost:9200"]
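
Just to make the idea concrete: the heavy lifting in such a processor is turning each <Event> XML record into structured fields. Below is a minimal sketch of that parsing step in Python (the real processor would of course be Go code inside Beats, reusing Winlogbeat's parser). The namespace is the standard Windows event XML schema; the output field names are illustrative, not Winlogbeat's exact mapping.

import xml.etree.ElementTree as ET

# Namespace used by Windows event XML records.
NS = {'e': 'http://schemas.microsoft.com/win/2004/08/events/event'}

def _text(parent, path):
    el = parent.find(path, NS)
    return el.text if el is not None else None

def decode_event_xml(xml_string):
    """Turn one <Event> XML record into a flat dict of fields."""
    root = ET.fromstring(xml_string)
    system = root.find('e:System', NS)
    provider = system.find('e:Provider', NS)
    time_created = system.find('e:TimeCreated', NS)
    fields = {
        'provider': provider.get('Name') if provider is not None else None,
        'event_id': _text(system, 'e:EventID'),
        'computer_name': _text(system, 'e:Computer'),
        'channel': _text(system, 'e:Channel'),
        'timestamp': time_created.get('SystemTime') if time_created is not None else None,
    }
    # <EventData> carries the per-event <Data Name="..."> parameters.
    event_data = root.find('e:EventData', NS)
    if event_data is not None:
        for data in event_data.findall('e:Data', NS):
            fields['event_data.' + (data.get('Name') or 'param')] = data.text
    return fields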

A solutions engineer provided me with the following:

  1. File output --> Elasticsearch
    After installing Winlogbeat on the machines producing the logs contained on my CD/DVDs ...
    ...Using this approach, you would be able to pull JSON-formatted logs from isolated machines, move them to a machine with connectivity to ES, and then use Filebeat or Logstash to read those logs and push them to ES (see the config sketch after this item).
    https://www.elastic.co/guide/en/beats/winlogbeat/current/file-output.html
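
A rough, hedged sketch of what those two configs might look like (the event log names, output path, and prospector path are placeholders to adapt). On each isolated Windows machine, Winlogbeat writes JSON lines to local files:

winlogbeat.event_logs:
  - name: Application
  - name: Security
  - name: System
output.file:
  path: "C:/winlogbeat-export"
  filename: winlogbeat

Then, on a machine with connectivity to ES, Filebeat reads the transferred files and decodes the JSON:

filebeat.prospectors:
  - paths: ['C:/transfer/winlogbeat*']
    json.keys_under_root: true
    json.add_error_key: true
output.elasticsearch.hosts: ["http://localhost:9200"]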

If installing Winlogbeat is not possible, then there are a couple of options, all of which involve converting the .evtx files into something else:

  1. Custom parser --> ES
    You could write a custom parser that uses a third-party library to read the .evtx files, builds the necessary JSON, and posts it to ES over the HTTP API. I'd recommend using the '_bulk' endpoint; a sketch of this approach follows after this list.
    https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-bulk.html
    https://github.com/plutonbacon/evtx.rb
    https://github.com/williballenthin/python-evtx

  2. Apache NiFi --> ES
    NiFi has built-in support for .evtx files. You can use it to build a pipeline that takes the .evtx logs from the filesystem, parses out the data, builds the necessary JSON, and posts it to Elasticsearch.
    https://www.community.hortonworks.com/articles/58493/parsing-evtx-files-with-apache-nifi.html
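
To make option 1 concrete, here is a minimal sketch using python-evtx (linked above) together with the _bulk endpoint. The host, index name, and the choice to keep the raw record XML in a single field are assumptions for illustration; a real parser would extract individual fields from each record (along the lines of the XML parsing sketch earlier in this thread), and a recent Elasticsearch version is assumed (no _type in the bulk action).

import json
import requests
import Evtx.Evtx as evtx

ES_URL = "http://localhost:9200/_bulk"   # assumed local ES instance
INDEX = "winlogs"                         # illustrative index name

def _flush(lines):
    """Send one NDJSON bulk request to Elasticsearch."""
    body = "\n".join(lines) + "\n"
    resp = requests.post(ES_URL, data=body.encode("utf-8"),
                         headers={"Content-Type": "application/x-ndjson"})
    resp.raise_for_status()

def bulk_index(evtx_path, batch_size=500):
    """Read an .evtx file and index its records via the _bulk endpoint."""
    lines = []
    with evtx.Evtx(evtx_path) as log:
        for record in log.records():
            # Keep the raw event XML; a real parser would extract fields here.
            doc = {"event_xml": record.xml()}
            lines.append(json.dumps({"index": {"_index": INDEX}}))
            lines.append(json.dumps(doc))
            if len(lines) >= batch_size * 2:
                _flush(lines)
                lines = []
    if lines:
        _flush(lines)

bulk_index("Security.evtx")  # example path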

Logstash may play a useful role in any of these scenarios.
