Filebeat for Windows DNS log


Hi everyone,
I'm writing this topic to get some help, advice, and tips on how to correctly send the DNS logs from a Windows domain controller. I'm a beginner with the ELK stack and I have to ship the logs of the DNS role (DHCP logs later).

My configuration:

  • ELK stack version 5.6.7 on CentOS Linux release 7.4.1708
    (Logstash 5.6.7)
  • Windows Server 2012 with the DNS role

I would like to send the DNS log file to ELK.

  1. I did a test with Filebeat, with output to Elasticsearch. It works, but I have no filter, so the message field contains all the information and I can't use it properly.

  2. Same test with Filebeat, but with output to Logstash. I understand that I have to parse the information, but I am a little bit lost.
    How can I skip the first 30 lines?
    I have found this pattern

but I don't understand this part:
(?:Q|R|U) ?(Q|R|U)

Does that mean the pattern matches if the data is a Q, an R, or a U? But no field is added?

  3. I did a test with Packetbeat, with output to Elasticsearch. It works properly and I get a lot of information, but how can I filter out or skip the columns I don't want? Is it possible to do this in the yml file?
    Or is the only solution to skip the fields when I choose the columns in Logstash?

  4. Would it be better to use a PowerShell script to parse the log file, export it to JSON or CSV, and analyse that export with Filebeat? If I do this, how can I resume the analysis to avoid losing information? Can Filebeat do this?

  5. Do I have other possibilities? Does version 6.2 offer other tools?

  6. If someone has already shipped the DNS logs of a Windows Server, how did you do it? What solution did you choose?

Thanks for your help.


(Andrew Kroh) #2

The simplest way to get very detailed DNS traffic logs is to run Packetbeat on the DNS server. If the data is too detailed, you can discard fields in the Packetbeat configuration by using a drop_fields processor.
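A minimal sketch of such a processor in packetbeat.yml; the field names listed here are only examples (assumptions), so check your own events for the exact names you want to drop:

```yaml
# Drop fields you don't need from DNS transaction events.
processors:
  - drop_fields:
      when:
        equals:
          type: dns
      fields: ["bytes_in", "bytes_out", "responsetime"]
```

The `when` condition restricts the processor to DNS events; without it, the fields would be dropped from every event Packetbeat emits.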

If you want to extract data from your Windows DNS logs then you'll need to use Filebeat -> Logstash, then grok or dissect the logs to extract the fields you need.
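As a sketch, a Logstash grok filter for a Windows DNS debug log "PACKET" line might look like the following. The field names (thread_id, client_ip, etc.) and the exact pattern are assumptions; your log format depends on the debug-logging options enabled on the server, so adjust accordingly:

```
filter {
  grok {
    # Hypothetical pattern for a Windows DNS debug log PACKET line.
    # Verify each field against a sample line from your own log.
    match => {
      "message" => "%{DATE:date} %{TIME:time} %{WORD:thread_id} PACKET\s+%{WORD:internal_packet_id} %{WORD:protocol} %{WORD:direction} %{IP:client_ip}"
    }
  }
}
```

Lines that don't match (such as the file header) will simply be tagged with _grokparsefailure, which you can use to drop them.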

That's not a commonly used technique. Let Filebeat ship the raw data to Logstash as soon as it is available and leave the parsing to LS.

This is a regular expression. Grok sits on top of regular expressions, so any regular expressions are valid in grok as well. (?:Q|R|U) is a non-capturing group that matches Q OR R OR U. Next is space followed by ? which means the space is optional. Finally (Q|R|U) is a capturing group that again matches Q OR R OR U.
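To make that second group actually produce a field, grok also supports named captures with the (?&lt;name&gt;...) syntax; the field name "flags" below is just an illustration:

```
# Unnamed group: matches but adds no field.
(?:Q|R|U) ?(Q|R|U)

# Named capture: same match, but stores the second letter in a "flags" field.
(?:Q|R|U) ?(?<flags>Q|R|U)
```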

I often use the Grok Debugger to test grok patterns.


Thank you very much for your answers.

I have uninstalled Packetbeat; it gives me too much information, and it doesn't include the information I found in the DNS logs.

I'm using Filebeat, and I have tried to send only the lines I want with the "include_lines" option,
and it's working :slight_smile:

I chose to send the lines containing "packet", and now I have to parse the information I want to keep with Logstash.
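For anyone finding this later, a minimal sketch of that setup for Filebeat 5.x; the log path and the "PACKET" marker are assumptions, so adjust them to your own DNS debug log:

```yaml
filebeat.prospectors:
  - input_type: log
    paths:
      - 'C:\dns\dns.log'
    # Only ship lines containing PACKET; this also skips the file header.
    include_lines: ['PACKET']

output.logstash:
  hosts: ["localhost:5044"]
```

Note that include_lines is a list of regular expressions, and only lines matching at least one of them are exported.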

Do I have to restart Logstash each time I modify the filter?

(Andrew Kroh) #4

I'm curious what information is contained in the DNS server logs that isn't reported by Packetbeat?

Unless you have enabled config reloading, you do need to restart after each change.
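Config reloading can be switched on in logstash.yml; a minimal sketch (the interval value is just an example):

```yaml
# logstash.yml: re-read the pipeline config when it changes on disk.
config.reload.automatic: true
config.reload.interval: 3   # seconds between checks
```

The same behaviour can be enabled for a single run with the --config.reload.automatic command-line flag.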

(system) #5

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.