Filebeat & Windows DHCP Logging

I'm trying to locate some examples that are specific to Elastic Cloud when using Filebeat to capture Microsoft DHCP logs. None of the examples I find online really match what I'm seeing in the 7.12.0 example files, or they assume a lot, IMHO, about the knowledge of the person reading the blog post.

Could someone be so kind as to post an example of what it takes to get Windows DHCP logs into an Elastic Cloud deployment? To my knowledge, I'm not using Logstash at all; everything I've done so far has been installing Beats, configuring the yml file, enabling a couple of modules, and starting the service after running the initial *beat.cmd setup -e command.

Hello @billkindle, thanks for reaching out! It seems like you have all the steps right already; are you seeing any issues?

Let's say you install the Filebeat agent on the server that has the DHCP logs, either the Windows machine itself or another machine that the logs are transferred to. After that, it would be something like this:

  1. Install filebeat on the host you want to use.
  2. Configure filebeat.yml to point to the cloud cluster you want; by default, the credentials are the ones you use to log in to Kibana. For example, if you click on Add data in Kibana:

    and click on any of the examples (it does not have to be exactly the one you are adding, as you are just looking for the credentials), they should be under "Step 2, Edit the configuration".

There you will see the cloud.id and cloud.auth fields that you need to fill in in your filebeat.yml configuration.
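For reference, that part of filebeat.yml would look something like this sketch (the values below are placeholders, not real credentials; copy the actual cloud.id from your deployment, and use a user that is allowed to ingest data and run setup):

# filebeat.yml (sketch with placeholder values)
cloud.id: "my-deployment:dXMtZWFzdC0xLmF3cy5mb3VuZC5pby..."   # placeholder, copy yours from the Add data page or the Elastic Cloud console
cloud.auth: "elastic:your-password"                            # placeholder user:password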

Now that you have the connection between Filebeat and the cloud configured, you need to enable the module. All modules have pretty much the same workflow: enable the module, open up the module configuration, and at the end run your filebeat setup command. For example:

filebeat modules enable microsoft

After that, each module you enable will have a configuration file; the default location would be:

C:\Program Files\Filebeat\modules.d\microsoft.yml

The yml file name should be the same as the module you enabled.

Now, each module's configuration file is a bit different, as all integrations have different requirements. First, for any of the Microsoft filesets you are not interested in, you can make sure
enabled: false is set.

For the DHCP fileset you then have two options: you can either give it a path to your DHCP log files, or you can configure a syslog listening port if the data is coming through syslog. A sketch of both options follows below.
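As a rough sketch of what the dhcp section in modules.d/microsoft.yml could look like (the variable names and the DHCP log path below are assumptions based on a typical 7.x shipped file, so check the commented examples in your own copy before using them):

- module: microsoft
  dhcp:
    enabled: true
    # Option 1: read the DHCP audit log files directly from disk.
    # C:\Windows\System32\dhcp is the usual default location; adjust if yours differs.
    var.input: file
    var.paths:
      - "C:/Windows/System32/dhcp/DhcpSrvLog-*.log"

    # Option 2: listen for the logs over syslog instead.
    # Comment out Option 1 and uncomment these if the DHCP server forwards via syslog.
    #var.input: udp
    #var.syslog_host: 0.0.0.0
    #var.syslog_port: 9515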

After this is configured and filebeat setup has been run, you can start Filebeat and your logs should be available to you. The easiest way to check this is, for example, to go to Stack Management in the left sidebar menu and click Index Management; that will show you whether data is coming in.
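On the Windows host, the command-line side of that sequence would look roughly like this, assuming the default install location and that the Filebeat service was installed with the bundled install-service-filebeat.ps1 script:

cd "C:\Program Files\Filebeat"
.\filebeat.exe modules enable microsoft
.\filebeat.exe setup -e        # loads the index template, ILM policy and dashboards into the cloud cluster
Start-Service filebeat         # start shipping logs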

If you want to find out exactly what is sending you data, you can use, for example, the Discover page, which is near the top of the left sidebar menu.

Make sure filebeat-* is chosen at the top left, and start typing a filter/search at the top, for example event.module:

I don't have any data in this demo environment, but at this point it should give you a list of the possible event.module values that already have data in Elasticsearch.

You could then complete it and hit Enter, for example event.module: microsoft, which means that any data you then see in the list comes from the microsoft module you enabled.
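If you want to narrow it down to just the DHCP data, you can also filter on the dataset; assuming the fileset follows the usual module.fileset naming, a query like this should work:

event.module: "microsoft" and event.dataset: "microsoft.dhcp"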

Hopefully this gives some direction. I was a bit unsure whether any of the steps had been missed; are there any specific errors you might be seeing?

