Hello,
I am trying to ship Windows Server DNS and DHCP logs to my test ELK environment (Ubuntu). I can see that Filebeat is sending the logs, but for some reason Logstash is only receiving DNS logs, not DHCP. I am using the latest version of the ELK stack.
Here is my filebeat.yml:
#==== Filebeat inputs =========
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - 'C:\dhcplogs\DhcpSrvLog-*.log'
  include_lines: ["^[0-9]"]
  document_type: dhcp
  fields_under_root: true
  close_removed: false
  clean_removed: false
  ignore_older: 47h
  clean_inactive: 48h
  fields:
    type: dhcp
  fields_under_root: true
- type: log
  enabled: true
  paths:
    - 'C:\dnslogs\dns*.log'
  include_lines: ["^[0-9]"]
  document_type: dns
  fields_under_root: true
  close_removed: false
  clean_removed: false
  ignore_older: 47h
  clean_inactive: 48h
  fields:
    type: dns
  fields_under_root: true
#===Kibana====================
setup.kibana:
  host: "X.X.X.X:5601"
#-------- Logstash output -----------------
output.logstash:
  hosts: ["X.X.X.X:5044"]
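To rule out the Filebeat side, I am thinking of temporarily swapping the Logstash output for a console output and running `filebeat -e` in the foreground, so I can check whether DHCP events are actually being published with `type: dhcp` (just a debugging sketch, not my real config):

```
# Temporary debug output. Filebeat allows only one output, so
# output.logstash has to be commented out while this is enabled.
# Run `filebeat -e` in the foreground and watch stdout for events
# carrying "type": "dhcp".
output.console:
  pretty: true
```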
In the Filebeat log file, I can see that Filebeat is sending (or at least tailing) both logs:
2020-03-27T06:21:05.906-0700 INFO registrar/registrar.go:152 States Loaded from registrar: 10
2020-03-27T06:21:05.907-0700 WARN beater/filebeat.go:368 Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2020-03-27T06:21:05.907-0700 INFO crawler/crawler.go:72 Loading Inputs: 2
2020-03-27T06:21:05.950-0700 INFO log/input.go:152 Configured paths: [C:\dhcplogs\DhcpSrvLog-*.log]
2020-03-27T06:21:05.950-0700 INFO input/input.go:114 Starting input of type: log; ID: 9252333259376502739
2020-03-27T06:21:05.971-0700 INFO log/harvester.go:297 Harvester started for file: C:\dhcplogs\DhcpSrvLog-Fri.log
2020-03-27T06:21:06.058-0700 INFO log/input.go:152 Configured paths: [C:\dnslogs\dns*.log]
2020-03-27T06:21:06.058-0700 INFO input/input.go:114 Starting input of type: log; ID: 15896225400827294810
2020-03-27T06:21:06.059-0700 INFO crawler/crawler.go:106 Loading and starting Inputs completed. Enabled inputs: 2
2020-03-27T06:21:06.059-0700 INFO cfgfile/reload.go:171 Config reloader started
2020-03-27T06:21:06.060-0700 INFO cfgfile/reload.go:226 Loading of config files completed.
2020-03-27T06:21:06.080-0700 INFO log/harvester.go:297 Harvester started for file: C:\dnslogs\dns2020-03-26T211811Z.log
2020-03-27T06:21:06.983-0700 INFO pipeline/output.go:95 Connecting to backoff(async(tcp://X.X.X.X:5044))
2020-03-27T06:21:07.001-0700 INFO pipeline/output.go:105 Connection to backoff(async(tcp://X.X.X.X:5044)) established
2020-03-27T06:21:33.649-0700 INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":1078,"time":{"ms":375}},"total":{"ticks":2406,"time":{"ms":672},"value":2406},"user":{"ticks":1328,"time":{"ms":297}}},"handles":{"open":256},"info":{"ephemeral_id":"63cee4cc-83f1-4326-a606-07a27cb5cb0d","uptime":{"ms":90150}},"memstats":{"gc_next":14871264,"memory_alloc":11575344,"memory_total":120022320,"rss":3751936},"runtime":{"goroutines":47}},"filebeat":{"events":{"added":3885,"done":3885},"harvester":{"files":{"5e2fdf7a-745f-4e10-a135-6c89cb986b02":{"last_event_published_time":"2020-03-27T06:21:32.034Z","last_event_timestamp":"2020-03-27T06:21:32.034Z","name":"C:\\dhcplogs\\DhcpSrvLog-Fri.log","read_offset":1472824,"size":1471019,"start_time":"2020-03-27T06:21:05.953Z"},"db3c0abb-56e2-49dd-ac7f-197ff93aa4d6":{"last_event_published_time":"2020-03-27T06:21:31.236Z","last_event_timestamp":"2020-03-27T06:21:31.236Z","name":"C:\\dnslogs\\dns2020-03-26T211811Z.log","read_offset":94100204,"size":93977245,"start_time":"2020-03-27T06:21:06.075Z"}},"open_files":2,"running":2,"started":2}},"libbeat":{"config":{"module":{"running":0},"reloads":1},"output":{"events":{"acked":1958,"batches":5,"total":1958},"read":{"bytes":30},"write":{"bytes":129692}},"pipeline":{"clients":2,"events":{"active":0,"filtered":1927,"published":1958,"retry":1136,"total":3885},"queue":{"acked":1958}}},"registrar":{"states":{"current":10,"update":3885},"writes":{"success":16,"total":16}}}}}
2020-03-27T06:22:03.647-0700 INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":1296,"time":{"ms":218}},"total":{"ticks":2702,"time":{"ms":296},"value":2702},"user":{"ticks":1406,"time":{"ms":78}}},"handles":{"open":256},"info":{"ephemeral_id":"63cee4cc-83f1-4326-a606-07a27cb5cb0d","uptime":{"ms":120148}},"memstats":{"gc_next":13915552,"memory_alloc":10364368,"memory_total":133421552,"rss":282624},"runtime":{"goroutines":47}},"filebeat":{"events":{"added":1058,"done":1058},"harvester":{"files":{"5e2fdf7a-745f-4e10-a135-6c89cb986b02":{"last_event_published_time":"2020-03-27T06:21:59.127Z","last_event_timestamp":"2020-03-27T06:21:59.127Z","read_offset":1924,"size":1924},"db3c0abb-56e2-49dd-ac7f-197ff93aa4d6":{"last_event_published_time":"2020-03-27T06:21:56.286Z","last_event_timestamp":"2020-03-27T06:21:56.286Z","read_offset":79912,"size":122959}},"open_files":2,"running":2}},"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"acked":537,"batches":5,"total":537},"read":{"bytes":30},"write":{"bytes":38849}},"pipeline":{"clients":2,"events":{"active":0,"filtered":521,"published":537,"total":1058},"queue":{"acked":537}}},"registrar":{"states":{"current":10,"update":1058},"writes":{"success":5,"total":5}}}}}
But I can only see DNS logs in Kibana. Here is my Logstash pipeline configuration (filebeat.conf):
input {
  beats {
    port => 5044
  }
}

filter {
  if [type] == "dhcp" {
    dissect {
      mapping => {
        "message" => "%{ID},%{Date},%{Time},%{Description},%{IP_Address},%{Host_Name},%{MAC_Address},%{User_Name},%{TransactionID},%{QResult},%{Probationtime},%{CorrelationID},%{Dhcid},%{VendorClass_hex},%{VendorClass_ascii},%{UserClass_hex},%{UserClass_ascii},%{RelayAgentInformation},%{DnsRegError}"
      }
    }
    mutate {
      add_field => { "log_timestamp" => "%{Date}-%{Time}" }
    }
    date {
      match => [ "log_timestamp", "MM/dd/YY-HH:mm:ss" ]
      timezone => "America/New_York"
    }
    if "_dateparsefailure" not in [tags] {
      mutate {
        remove_field => ['Date', 'Time', 'log_timestamp', 'message']
      }
    }
  }
  else if [type] == "dns" {
    grok {
      patterns_dir => ["/etc/logstash/conf.d/patterns"]
      match => {
        "message" => "%{MS_DNS_DATE:date}\s+%{TIME:time}\s+(?:AM|am|PM|pm)\s+%{DATA:thread_id}\s+%{WORD:dns_type}\s+%{BASE16NUM:packet_id}\s+%{WORD:dns_protocol}\s+%{WORD:dns_direction}\s+%{IP:dns_ip}\s+%{BASE16NUM:xid}\s+%{DATA:response}\s+%{WORD:dns_query_type}\s+\[%{BASE16NUM:hex_flags}\s+%{WORD:rcode_name}\s+%{WORD:Flag}\]\s+%{WORD:query_type_name}\s+%{GREEDYDATA:dns_domain}"
      }
    }
    mutate {
      gsub => [
        # Remove leading (n)
        "dns_domain", "^\(\d+\)", "",
        # Remove trailing (n)
        "dns_domain", "\(\d+\)$", "",
        # Replace inner (n)
        "dns_domain", "\(\d+\)", "."
      ]
    }
  }
}

output {
  if [type] == "dhcp" {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "dhcp-%{+YYYY.MM.dd}"
    }
  }
  else if [type] == "dns" {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "dns-%{+YYYY.MM.dd}"
    }
  }
}
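One thing I noticed while writing this up: events whose [type] matches neither conditional are silently dropped by the output section. So I am planning to add a catch-all else branch with a rubydebug stdout, to see whether the DHCP events are arriving with some unexpected type value (a debugging sketch of the same output block):

```
output {
  if [type] == "dhcp" {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "dhcp-%{+YYYY.MM.dd}"
    }
  }
  else if [type] == "dns" {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "dns-%{+YYYY.MM.dd}"
    }
  }
  else {
    # Catch-all: anything matching neither type is printed here instead of
    # vanishing; metadata => true also shows the @metadata fields Beats sets.
    stdout { codec => rubydebug { metadata => true } }
  }
}
```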
where MS_DNS_DATE is a custom pattern defined in my patterns directory:
MS_DNS_DATE %{MONTHNUM}/%{MONTHDAY}/%{YEAR}
I even tried using a grok filter instead of dissect for the DHCP logs (tested in the Grok Debugger first), but with the same result:
if [type] == "dhcp" {
  grok {
    patterns_dir => ["/etc/logstash/conf.d/patterns"]
    match => {
      "message" => "%{DATA:id},%{MS_DNS_DATE:date},%{TIME:time},%{DATA:Description},%{IPV4:ip},%{DATA:Hostname},%{DATA:MAC_Address},%{DATA:Username},%{GREEDYDATA:Remaining_Log}"
    }
  }
}
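To test the DHCP filter in isolation from Filebeat, I am also considering feeding a canned line through Logstash's generator input. The sample line below is one I made up to mimic the documented comma-separated DHCP log field order, not a real line from my server, so I would paste in an actual line from DhcpSrvLog-Fri.log before drawing conclusions:

```
input {
  generator {
    # Hypothetical line in ID,Date,Time,Description,IP,... order;
    # replace with a real line copied from DhcpSrvLog-Fri.log.
    lines => ['10,03/27/20,06:21:06,Assign,192.168.1.50,testhost,00AABBCCDDEE,,843099,0,,,,,,,,,0']
    count => 1
    add_field => { "type" => "dhcp" }
  }
}
output {
  stdout { codec => rubydebug }
}
```

Run with `bin/logstash -f test.conf` (together with the filter block) to see whether the dissect/date logic produces the expected fields for a single known line.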
I am not sure what to do next. There is nothing in the Logstash log file either: no errors, nothing. Can anyone help me?