Windows NPS Logs

Hello everyone,

I've been looking for a Filebeat module for NPS logs, but there doesn't appear to be one available.

What are my options here?

I have Winlogbeat, which can pull events from the Event Log, but there are much more detailed log files in C:\Windows\Logs.

Is Filebeat the way to go here?


If they are plain text log files then yes, Filebeat is the way to go. Can you post some sample logs?

Or the new Elastic Agent :slight_smile:

Thanks for your replies :slight_smile:

I capture event IDs 6272 and 6273 with Winlogbeat.

Sure @legoguy1000

<Event><Timestamp data_type="4">11/12/2021 09:13:09.669</Timestamp><Computer-Name data_type="1">SPDC2</Computer-Name><Event-Source data_type="1">IAS</Event-Source><Acct-Status-Type data_type="0">2</Acct-Status-Type><NAS-IP-Address data_type="3"></NAS-IP-Address><User-Name data_type="1">host/device.domain.local</User-Name><NAS-Port data_type="0">0</NAS-Port><NAS-Port-Type data_type="0">19</NAS-Port-Type><Calling-Station-Id data_type="1">xxxx18d2b53b</Calling-Station-Id><Called-Station-Id data_type="1">xxxx12cd9d2a</Called-Station-Id><Framed-IP-Address data_type="3"></Framed-IP-Address><Acct-Multi-Session-Id data_type="1">xxx18D2B53B-1636662020</Acct-Multi-Session-Id><Acct-Session-Id data_type="1">348A1259D2B2-18CC18D2B53B-618D9548-E2ABB</Acct-Session-Id><Acct-Delay-Time data_type="0">0</Acct-Delay-Time><Vendor-Specific data_type="2">000039E70508454357494649</Vendor-Specific><Vendor-Specific data_type="2">000039E706124450204C6561726E20496E6F76617465</Vendor-Specific><Vendor-Specific data_type="2">000039E702060000005C</Vendor-Specific><Class data_type="1">311 1 11/01/2021 23:33:08 418818</Class><Vendor-Specific data_type="2">000039E70C0857696E203130</Vendor-Specific><Acct-Input-Octets data_type="0">35368</Acct-Input-Octets><Acct-Output-Octets data_type="0">2632367</Acct-Output-Octets><Acct-Input-Packets data_type="0">916</Acct-Input-Packets><Acct-Output-Packets data_type="0">4247</Acct-Output-Packets><Acct-Input-Gigawords data_type="0">0</Acct-Input-Gigawords><Acct-Output-Gigawords data_type="0">0</Acct-Output-Gigawords><Acct-Terminate-Cause data_type="0">3</Acct-Terminate-Cause><Acct-Session-Time data_type="0">45</Acct-Session-Time><Service-Type data_type="0">1</Service-Type><NAS-Identifier data_type="1">ECWIFI</NAS-Identifier><Client-IP-Address data_type="3"></Client-IP-Address><Client-Vendor data_type="0">0</Client-Vendor><Client-Friendly-Name data_type="1">ArubaAP</Client-Friendly-Name><Proxy-Policy-Name data_type="1">ArubaAP</Proxy-Policy-Name><Packet-Type data_type="0">4</Packet-Type><Reason-Code data_type="0">0</Reason-Code></Event>

Are you advising users to ditch filebeat and winlogbeat in favor of elastic agent?

You can compare the capabilities here

Agent supports both the Windows event log and log file inputs.

But I think if you want to decode XML you will still need to use Filebeat with the decode_xml processor.

Unless you can get to that via the Windows integration.

Thanks stephenb

I guess I have a lot of reading in front of me.

Sorry, I have one more question if that's okay

If these logs already have a timestamp how do I force filebeat to use it instead of the @timestamp field?


"NDDC2","IAS",12/01/2020,09:24:02,11,,"domain.local/DOMAIN/Students BYOD/2023/Name Surname",,,,,,,,0,"","Hi6... (OUTPUT Truncated)

You could try the timestamp processor

Or set that in your ingest pipeline if you build one to parse those logs.
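For reference, the timestamp processor lives in the `processors` section of filebeat.yml. A minimal sketch, assuming you have already parsed the date into its own field (the field name `nps.timestamp` is a hypothetical placeholder, and the layout assumes US-style MM/DD dates like the sample above):

```yaml
processors:
  - timestamp:
      # hypothetical field name - the date must already be parsed out of the message
      field: nps.timestamp
      layouts:
        # Go reference-time syntax: 01/02/2006 means MM/DD/YYYY
        - '01/02/2006 15:04:05'
      test:
        # sample value validated at Filebeat startup
        - '12/01/2020 09:24:02'
      ignore_missing: true
```

The `test` entries are checked against the layouts when Filebeat starts, which catches a wrong layout early.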

Thanks stephenb

If I'm getting this right, I can't use the timestamp processor unless I create a custom pipeline first or decode the XML. Currently all I get from the Filebeat output is the message; no fields are parsed, so I can't specify a timestamp field to parse and delete.

I just have to work out how to create this pipeline or how to decode :confused:

Adding the following to the filebeat.yml processors section doesn't seem to work:

 - decode_xml:
      field: message
      target_field: "xml"
      overwrite_keys: true
      ignore_missing: true
      ignore_failure: true

Sorry stephenb, is this doable from filebeat.yml or do I need to send to Logstash first?

You should be able to use Filebeat processors or Logstash to parse the XML, and/or an Elasticsearch ingest pipeline to manipulate the data. Elasticsearch doesn't have an XML processor, which is why that has to be done in Filebeat or Logstash. Everything else can be done in whichever you prefer.
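For the DTS (XML) format, the whole chain can stay in Filebeat by running decode_xml first and the timestamp processor after it. A sketch, not a tested config: the field path `xml.event.timestamp` is an assumption about how decode_xml flattens the sample above (keys are lowercased by default), and the layout assumes US-style MM/DD dates:

```yaml
processors:
  - decode_xml:
      field: message
      target_field: xml
      ignore_missing: true
      ignore_failure: true
  - timestamp:
      # assumed path after decode_xml lowercases <Event><Timestamp> keys
      field: xml.event.timestamp
      layouts:
        # MM/DD/YYYY with milliseconds, in Go reference-time syntax
        - '01/02/2006 15:04:05.000'
      test:
        - '11/12/2021 09:13:09.669'
      ignore_failure: true
```

If the decoded field path differs, a quick look at one event in Discover (or Filebeat's console output) will show the actual key names to use.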

I was going insane thinking decode_xml wasn't working, but it turns out the old logs were not in XML format.

So before the upgrade the logs were in the ODBC (legacy) format, and the new (XML) ones are in the DTS Compliant format.

Thank you both for your help, so many options to take now

I've also found this for logstash

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.