VamPikmin
(Vam Pikmin)
November 13, 2021, 10:20am
Hello everyone,
I've been looking for a Filebeat module for NPS logs, but there doesn't appear to be one available.
What are my options here?
I have Winlogbeat, which can pull events from the Event Log, but the log files in C:\Windows\Logs are much more detailed.
Is Filebeat the way to go here?
Thanks
If they are plain text log files then yes, Filebeat is the way to go. Can you post some sample logs?
VamPikmin
(Vam Pikmin)
November 14, 2021, 4:46am
Thanks for your replies
I capture event IDs 6272 and 6273 with Winlogbeat.
Sure @legoguy1000
<Event><Timestamp data_type="4">11/12/2021 09:13:09.669</Timestamp><Computer-Name data_type="1">SPDC2</Computer-Name><Event-Source data_type="1">IAS</Event-Source><Acct-Status-Type data_type="0">2</Acct-Status-Type><NAS-IP-Address data_type="3">192.168.6.2</NAS-IP-Address><User-Name data_type="1">host/device.domain.local</User-Name><NAS-Port data_type="0">0</NAS-Port><NAS-Port-Type data_type="0">19</NAS-Port-Type><Calling-Station-Id data_type="1">xxxx18d2b53b</Calling-Station-Id><Called-Station-Id data_type="1">xxxx12cd9d2a</Called-Station-Id><Framed-IP-Address data_type="3">192.168.1.12</Framed-IP-Address><Acct-Multi-Session-Id data_type="1">xxx18D2B53B-1636662020</Acct-Multi-Session-Id><Acct-Session-Id data_type="1">348A1259D2B2-18CC18D2B53B-618D9548-E2ABB</Acct-Session-Id><Acct-Delay-Time data_type="0">0</Acct-Delay-Time><Vendor-Specific data_type="2">000039E70508454357494649</Vendor-Specific><Vendor-Specific data_type="2">000039E706124450204C6561726E20496E6F76617465</Vendor-Specific><Vendor-Specific data_type="2">000039E702060000005C</Vendor-Specific><Class data_type="1">311 1 192.168.101.14 11/01/2021 23:33:08 418818</Class><Vendor-Specific data_type="2">000039E70C0857696E203130</Vendor-Specific><Acct-Input-Octets data_type="0">35368</Acct-Input-Octets><Acct-Output-Octets data_type="0">2632367</Acct-Output-Octets><Acct-Input-Packets data_type="0">916</Acct-Input-Packets><Acct-Output-Packets data_type="0">4247</Acct-Output-Packets><Acct-Input-Gigawords data_type="0">0</Acct-Input-Gigawords><Acct-Output-Gigawords data_type="0">0</Acct-Output-Gigawords><Acct-Terminate-Cause data_type="0">3</Acct-Terminate-Cause><Acct-Session-Time data_type="0">45</Acct-Session-Time><Service-Type data_type="0">1</Service-Type><NAS-Identifier data_type="1">ECWIFI</NAS-Identifier><Client-IP-Address data_type="3">192.168.6.72</Client-IP-Address><Client-Vendor data_type="0">0</Client-Vendor><Client-Friendly-Name data_type="1">ArubaAP</Client-Friendly-Name><Proxy-Policy-Name 
data_type="1">ArubaAP</Proxy-Policy-Name><Packet-Type data_type="0">4</Packet-Type><Reason-Code data_type="0">0</Reason-Code></Event>
@stephenb
Are you advising users to ditch Filebeat and Winlogbeat in favor of Elastic Agent?
stephenb
(Stephen Brown)
November 14, 2021, 6:27am
You can compare the capabilities here.
Agent supports both the Windows event log and log file inputs.
But I think if you want to decode XML you will still need to use Filebeat with the decode_xml processor,
unless you can get that via the Windows integration.
VamPikmin
(Vam Pikmin)
November 14, 2021, 7:39am
Thanks stephenb
I guess I have a lot of reading in front of me.
VamPikmin
(Vam Pikmin)
November 14, 2021, 11:36pm
Sorry, I have one more question if that's okay.
If these logs already have a timestamp, how do I force Filebeat to use it instead of the @timestamp field?
message:
"NDDC2","IAS",12/01/2020,09:24:02,11,,"domain.local/DOMAIN/Students BYOD/2023/Name Surname",,,,,,,,0,"192.168.131.31","Hi6... (OUTPUT Truncated)
stephenb
(Stephen Brown)
November 14, 2021, 11:46pm
You could try the timestamp processor,
or set that in your ingest pipeline, if you build one to parse those logs.
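A minimal sketch of what the timestamp processor could look like in filebeat.yml. Note this assumes the date string has already been extracted into its own field; the field name `ias.timestamp` here is hypothetical, and the layouts use Go's reference-time format:

```yaml
processors:
  - timestamp:
      # "ias.timestamp" is a hypothetical field name -- substitute
      # whatever field actually holds the parsed date string.
      field: ias.timestamp
      layouts:
        - '01/02/2006 15:04:05'         # MM/DD/YYYY without milliseconds
        - '01/02/2006 15:04:05.000'     # MM/DD/YYYY with milliseconds
      ignore_missing: true
```

On success the processor overwrites @timestamp, and the original field can then be dropped with drop_fields.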
VamPikmin
(Vam Pikmin)
November 15, 2021, 7:51am
Thanks stephenb
If I'm getting this right, I can't use the timestamp processor unless I first create a custom pipeline or decode the XML. Currently all I get from the Filebeat output is the message; no fields are parsed, so I can't specify a (timestamp) field to parse and delete.
I just have to work out how to create this pipeline, or how to decode.
Adding the following to the filebeat.yml processors section doesn't seem to work:

  - decode_xml:
      field: message
      target_field: "xml"
      overwrite_keys: true
      ignore_missing: true
      ignore_failure: true
Sorry stephenb, is this doable from filebeat.yml, or do I need to send to Logstash first?
You should be able to use Filebeat processors or Logstash to parse the XML, and/or an Elasticsearch ingest pipeline to manipulate the data. Elasticsearch doesn't have an XML processor, which is why that part has to be done in Filebeat or Logstash. Everything else can be done in whichever you prefer.
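Chaining decode_xml and the timestamp processor in filebeat.yml might look something like the sketch below. The field path under `xml` is an assumption about how decode_xml nests the document (elements that carry attributes such as `data_type` typically end up as objects with a `#text` key), so inspect one decoded event before relying on it:

```yaml
processors:
  - decode_xml:
      field: message
      target_field: xml
      ignore_missing: true
      ignore_failure: true
  - timestamp:
      # Assumed path -- the <Timestamp> element has a data_type
      # attribute, so its text content may land under "#text".
      # Verify against a real decoded event.
      field: 'xml.Event.Timestamp.#text'
      layouts:
        - '01/02/2006 15:04:05.000'
      ignore_failure: true
```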
VamPikmin
(Vam Pikmin)
November 17, 2021, 7:18am
I was going insane thinking decode_xml wasn't working, but it turns out the old logs were not in XML format.
Before the upgrade the logs were in the ODBC (legacy) format,
and the new ones are in the DTS-compliant (XML) format.
Thank you both for your help; so many options to take now.
I've also found this for Logstash
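For reference, a sketch of what the Logstash side could look like, using the xml filter plus a date filter. The `[parsed][Timestamp][content]` path is an assumption about how the filter represents an element that also carries a `data_type` attribute, so verify it against real output before use:

```
filter {
  xml {
    source => "message"
    target => "parsed"
    force_array => false
  }
  date {
    # Assumed field path and pattern -- adjust after inspecting
    # the structure the xml filter actually produces.
    match => ["[parsed][Timestamp][content]", "MM/dd/yyyy HH:mm:ss.SSS"]
  }
}
```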
system
(system)
Closed
December 15, 2021, 9:18am
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.