Ingesting Windows events forwarded by Splunk heavy forwarders

Hiya

We are currently moving our SIEM from Splunk to Elastic.

Due to a tight deadline and network/firewall configuration we will be adding the Elastic endpoint to our current Splunk Heavy Forwarders.

This approach works fine for syslog sources, but I am unsure how to handle Windows events coming from the heavy forwarders.

It doesn't seem like Winlogbeat or the Elastic Agent has an input for this scenario.

Any advice would be appreciated.


It's been a while since I looked at this approach, but they push things via HTTP, right? If so, you could set up an ingest pipeline in Elasticsearch and then point the forwarder at it.
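As a rough sketch, assuming the forwarders POST JSON with the original event in a message field (the pipeline name and field names below are just placeholders):

PUT _ingest/pipeline/splunk-hf-windows
{
  "description": "Hypothetical pipeline for events pushed by the Splunk heavy forwarders",
  "processors": [
    {
      "json": {
        "field": "message",
        "target_field": "winlog_raw"
      }
    }
  ]
}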

Could I set it up to go via Logstash? The heavy forwarders won't have direct access to the Elastic ingest endpoint (although I might be able to arrange that).

In terms of the ingest pipeline: is there a way to get the ECS mappings so they are in line with Winlogbeat?

Thanks for the help!

You can configure an HTTP input in Logstash to receive the data from the heavy forwarders.

If you are going to use Logstash, there is no need for an ingest pipeline in Elasticsearch; you can parse the messages directly in Logstash, which is easier and more flexible than ingest pipelines. A minimal sketch is below.
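This assumes the heavy forwarders can POST JSON over HTTP; the port, hosts, and index name are placeholders you would adjust for your environment:

input {
  http {
    host => "0.0.0.0"     # listen on all interfaces
    port => 8080          # port the heavy forwarders will POST to
    codec => "json"       # assuming the payloads are JSON
  }
}

output {
  elasticsearch {
    hosts => ["https://your-elasticsearch:9200"]  # placeholder
    index => "windows-events"                     # placeholder
  }
}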

Ah of course.

I am pretty familiar with parsing syslog messages from various Linux boxes and network devices in Logstash; I guess my apprehension comes from my lack of a deep understanding of Windows event logs.

My last concern is getting the logs into ECS format so that the built-in SIEM alerts will work.

This will take some work: you will need to parse the message and rename the fields the same way Elastic does with Winlogbeat or the Elastic Agent, and you will also need to classify the events based on the event ID so that they show up correctly in the SIEM interface and in the alerts.

The parsing of the Windows event log XML is done directly in the code of the collector (Winlogbeat or Elastic Agent), using the decode_xml_wineventlog processor; you can find the code here, which should give you some hints about how to name the fields.
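Logstash has no decode_xml_wineventlog equivalent, but if the forwarders ship the raw event XML, the xml filter could be a starting point. The source and target field names here are assumptions about your payload:

filter {
  xml {
    source => "message"      # assumes the raw event XML arrives in the message field
    target => "winlog_raw"   # hypothetical target field for the parsed document
    force_array => false     # keep single child elements as objects, not arrays
  }
}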

The classification of the events is done using a script in an ingest pipeline that runs in Elasticsearch; you would need to apply the same conditionals and create the fields with the same values.

For example, for the security events, this ingest pipeline is executed, and you end up with something like this:

        "4624":
          category:
            - authentication
          type:
            - start
          action: logged-in

This means that for an event with code 4624 you would need to create the following fields:

{
  "event": {
    "category": ["authentication"],
    "type": ["start"],
    "action": "logged-in"
  }
}
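In Logstash, one way to reproduce that classification would be a conditional per event ID. A minimal sketch, assuming the event ID ended up in a winlog.event_id field during parsing (note that event.category and event.type are arrays in ECS):

filter {
  if [winlog][event_id] == "4624" {
    ruby {
      code => '
        event.set("[event][category]", ["authentication"])  # ECS expects an array here
        event.set("[event][type]", ["start"])               # same for event.type
        event.set("[event][action]", "logged-in")
      '
    }
  }
}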
