Do we have SIEM dashboards and anomaly detection for DHCP logs?

Hi Team,

I have onboarded DHCP logs into ELK and am looking for SIEM anomaly detection for DHCP logs. I didn't see any DHCP-related dashboards in the SIEM app.

DHCP log example:
Fields:
ID,Date,Time,Description,IP Address,Host Name,MAC Address,User Name, TransactionID, QResult,Probationtime, CorrelationID,Dhcid,VendorClass(Hex),VendorClass(ASCII),UserClass(Hex),UserClass(ASCII),RelayAgentInformation,DnsRegError.
Logs:
24,11/25/16,00:00:36,Database Cleanup Begin,,,,,0,6,,,,,,,,,0
30,11/25/16,00:00:36,DNS Update Request,10.115.0.70,HOSTNAME,,,0,6,,,,,,,,,0

I have converted all the fields to ECS format in Logstash:
"%{DATA:id},%{DATE_US:date},(?<time>%{HOUR}:%{MINUTE}:%{SECOND}),%{DATA:description},%{IPV4:[source][ip]},%{DATA:[source][hostname]},%{DATA:mac},%{DATA:[user][name]},%{INT:[transaction][id]},%{INT:[q][result]},%{DATA:[probation][time]},%{DATA:[correlation][id]},%{DATA:dhcid},%{DATA:[vendorclass][hex]},%{DATA:[vendorclass][ascii]},%{DATA:[userclass][hex]},%{DATA:[userclass][ascii]},%{DATA:[relayagent][information]},%{INT:[dns][reg][error]}"
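To illustrate what the grok pattern above produces, here is a minimal Python sketch (illustrative only, not how Logstash works internally) that splits one comma-separated DHCP log line and maps each value onto its ECS-style dotted field name in the same order as the pattern:

```python
# Field names in the same order as the grok pattern above (dotted form).
FIELDS = [
    "id", "date", "time", "description", "source.ip", "source.hostname",
    "mac", "user.name", "transaction.id", "q.result", "probation.time",
    "correlation.id", "dhcid", "vendorclass.hex", "vendorclass.ascii",
    "userclass.hex", "userclass.ascii", "relayagent.information",
    "dns.reg.error",
]

def parse_dhcp_line(line):
    """Split one DHCP log line and keep only the non-empty fields."""
    values = line.split(",")
    return {name: value for name, value in zip(FIELDS, values) if value}

doc = parse_dhcp_line(
    "30,11/25/16,00:00:36,DNS Update Request,10.115.0.70,HOSTNAME,,,0,6,,,,,,,,,0"
)
# doc["source.ip"] is "10.115.0.70"; empty columns (mac, user.name, ...) are dropped
```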

Can you please tell me whether DHCP logs will work with the SIEM module?

Hi Sundar,

Given some caveats, yes, your DHCP logs will work for SIEM anomalies. SIEM anomalies are built on top of machine learning jobs in the Elastic Stack, which create anomalies from data in the specific indices that each ML job watches. A list of available jobs can be found here.

In order to get your DHCP data analyzed for anomalies with SIEM, you will need to determine which jobs you want running, ensure the necessary ECS fields are mapped so those jobs can run, and make sure the index the DHCP data lives in matches the index pattern of the corresponding ML jobs. The ML jobs were built to analyze data indexed by Beats, so the ML jobs' index patterns mostly mirror the Beats index patterns.

For example, if you wanted to run the job packetbeat_rare_dns_question, you would ensure the fields

  • host.name
  • dns.question.name
  • dns.question.type
  • event.dataset
  • agent.type

are mapped properly and that the index the data resides in matches the packetbeat-* index pattern.
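To make that concrete, here is a minimal Python sketch (not part of the Elastic stack; the job name and its field list come straight from the example above) of a pre-flight check for whether a document carries the fields a given job needs:

```python
# Required ECS fields per ML job (only the example job from above is listed).
REQUIRED = {
    "packetbeat_rare_dns_question": [
        "host.name", "dns.question.name", "dns.question.type",
        "event.dataset", "agent.type",
    ],
}

def missing_fields(job, doc):
    """Return the required ECS fields for `job` that `doc` does not contain."""
    return [field for field in REQUIRED[job] if field not in doc]

# A document missing three of the five required fields:
doc = {"host.name": "host1", "dns.question.name": "example.com"}
missing_fields("packetbeat_rare_dns_question", doc)
# → ["dns.question.type", "event.dataset", "agent.type"]
```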

If you want to see which index patterns are needed for which jobs, this is an example of the manifest you can check to see which jobs require which indices. The docs for the jobs should provide information on which ECS fields you need.

Also, I find it helpful to use the "Inspect" button on the tables in the SIEM app, as it can show which ECS fields I should have mapped so the data I want to see in the tables is visible.


Thanks Devin for the quick response. I agree with you about the DNS query logs; it looks like there are no predefined jobs for DHCP logs. I am able to get the all-hosts table data for DHCP, but apart from that there are no SIEM dashboards. Is there any correlation between DHCP and other logs? Do we need to follow any mandatory fields for DHCP logs?
The DHCP log format is:
Fields:
ID,Date,Time,Description,IP Address,Host Name,MAC Address,User Name, TransactionID, QResult,Probationtime, CorrelationID,Dhcid,VendorClass(Hex),VendorClass(ASCII),UserClass(Hex),UserClass(ASCII),RelayAgentInformation,DnsRegError

Log:
30,11/25/16,00:00:36,DNS Update Request,10.115.0.70,HOSTNAME,,,0,6,,,,,,,,,0

Below is our JSON data, which follows the ECS format:

{
  "_index": "test-dhcp-2020.05.06",
  "_type": "_doc",
  "_id": "6fJN6HEBoMLvblKuJ75v",
  "_version": 1,
  "_score": null,
  "_source": {
    "vendorclass": {
      "hex": "0x4D53465420352E30"
    },
    "@version": "1",
    "_@timestamp": 1588740298.96666,
    "type": "test-windows-dhcp",
    "vendorClass": {
      "ascii": "MSFT 5.0"
    },
    "transaction": {
      "id": "3393005611"
    },
    "date_field": "05/06/20 08:44:58",
    "@timestamp": "2020-05-06T04:44:58.000Z",
    "time": "08:44:58",
    "dns": {
      "reg": {
        "error": "0"
      }
    },
    "event": "A lease was renewed by a client",
    "source": {
      "ip": "10.115.1.16",
      "hostname": "host"
    },
    "tailed_path": "C:\\Windows\\System32\\dhcp\\DhcpSrvLog-Wed.log",
    "id": "11",
    "mac": "00505691E822",
    "q": {
      "result": "0"
    },
    "date": "05/06/20",
    "message": "11,05/06/20,08:44:58,Renew,10.115.1.16,test,00505691E822,,3393005611,0,,,,0x4D53465420352E30,MSFT 5.0,,,,0",
    "description": "Renew"
  },
  "fields": {
    "@timestamp": [
      "2020-05-06T04:44:58.000Z"
    ]
  },
  "sort": [
    1588740298000
  ]
}

I'm curious what kinds of anomalies you think would be useful to detect on this data, because you could always create your own custom ML jobs! :slight_smile:
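For instance, one anomaly idea for this data is flagging rarely seen event descriptions. Here is a tiny Python sketch of that idea outside of Elastic ML (the threshold and the choice of the description field are assumptions, purely for illustration):

```python
from collections import Counter

def rare_descriptions(descriptions, threshold=0.01):
    """Return event descriptions whose share of all events is below `threshold`."""
    counts = Counter(descriptions)
    total = sum(counts.values())
    return sorted(d for d, c in counts.items() if c / total < threshold)

# 1000 events, of which "NACK" appears only 5 times (0.5% < 1% threshold):
events = ["Renew"] * 500 + ["DNS Update Request"] * 495 + ["NACK"] * 5
rare_descriptions(events)
# → ["NACK"]
```

A real ML job would of course model rates over time per host rather than a flat frequency cutoff, but this is the shape of the question such a job answers.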