Filebeat Windows Config Help (filebeat.yml)


#1

Trying to capture IIS logs in close to real time, but Kibana seems to only show old logs (the day before and older). Can anyone point me in the right direction to getting close to real-time logs displayed in Kibana? OS is Windows Server 2008

Config:

filebeat:
  prospectors:
    -
      paths:
        - C:\inetpub\logs\LogFiles\*\*
        - D:\logfiles\*\*
      encoding: utf-8
      exclude_lines: ["^#"]
      exclude_files: [".zip"]
      ignore_older: 24h
  registry_file: "C:/ProgramData/filebeat/registry"

output:
  logstash:
    hosts: ["logstashl:3525"]
    compression_level: 0

logging:
  files:
    rotateeverybytes: 10485760 # = 10MB

And the GROK filter in the logstash config:

if [type] == "IIS" {
  if [message] =~ "^#" {
    drop {}
  }

  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{DATA:service-name} %{DATA:hostname} %{IPV4:site} %{DATA:method} %{URIPATH:page} (?:%{NOTSPACE:querystring}|-) %{INT:port} %{NOTSPACE:username} %{IPV4:clienthost} %{NOTSPACE:useragent} %{DATA:referer} %{INT:response} %{INT:subresponse} %{INT:scstatus} %{NUMBER:sc-bytes} %{NUMBER:cs-bytes} %{NUMBER:iis-time-taken}"]
  }

  date {
    match => [ "log_timestamp", "YYYY-MM-dd HH:mm:ss" ]
  }

  kv {
    source => "querystring"
    field_split => "&"
  }

  urldecode {
    field => "query"
  }

  if [sc-bytes] {
    ruby {
      code => "event['kilobytesSent'] = event['sc-bytes'].to_i / 1024.0"
    }
  }

  if [cs-bytes] {
    ruby {
      code => "event['kilobytesReceived'] = event['cs-bytes'].to_i / 1024.0"
    }
  }

  mutate {
    gsub => [
      "query", "\+", " "
    ]
    remove_field => ["querystring", "message"]
    convert => ["sc-bytes", "integer"]
    convert => ["cs-bytes", "integer"]
    convert => ["iis-time-taken", "integer"]
  }

  useragent {
    source => "useragent"
    prefix => "browser"
  }
}
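As a standalone sanity check of the byte-to-kilobyte arithmetic in the inline Ruby above (plain Ruby run outside Logstash; the value is illustrative):

```ruby
# grok captures arrive as strings, so to_i is needed before dividing;
# dividing by 1024.0 (a float) keeps the fractional kilobytes
sc_bytes = "2293"
kilobytes_sent = sc_bytes.to_i / 1024.0
puts kilobytes_sent   # => 2.2392578125
```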

(ruflin) #2

The filebeat config above looks quite off with the indentation. Is that just a copy/paste issue?

So today you see in Kibana the logs from yesterday? That would mean that, over the long term, all logs are available? And do Kibana, Logstash, Filebeat, and the server itself all use the same timezone?


#3

Hi Ruflin - thanks for responding!

Yes, the copy/paste issue made the formatting appear strange. I just checked Kibana today, and the most recent filebeat IIS logs are from midnight, nothing newer.

Yes, the time zones should all be UTC.


(ruflin) #4

As you process the timestamp, can you check whether IIS perhaps uses the wrong time zone? Can you post a log line here? Also, can you verify that all systems are on UTC?


#5

All systems, including IIS are configured to use UTC. Here are sample lines from IIS logs:

#Software: Microsoft Internet Information Services 7.5
#Version: 1.0
#Date: 2016-06-15 00:00:02
#Fields: date time s-sitename s-computername s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs(User-Agent) cs(Referer) sc-status sc-substatus sc-win32-status sc-bytes cs-bytes time-taken
2016-06-15 00:00:02 W3SVC1 SERVER 10.100.0.100 GET / - 443 - 10.0.63.1 - - 401 0 0 2293 7 1
2016-06-15 00:00:06 W3SVC1 SERVER 10.100.0.100 GET / - 443 - 10.0.63.1 - - 401 0 0 2293 7 1
2016-06-15 00:00:12 W3SVC1 SERVER 10.100.0.100 GET / - 443 - 10.0.63.1 - - 401 0 0 2293 7 1
2016-06-15 00:00:16 W3SVC1 SERVER 10.100.0.100 GET / - 443 - 10.0.63.1 - - 401 0 0 2293 7 2
2016-06-15 00:00:22 W3SVC1 SERVER 10.100.0.100 GET / - 443 - 10.0.63.1 - - 401 0 0 2293 7 2
2016-06-15 00:00:26 W3SVC1 SERVER 10.100.0.100 GET / - 443 - 10.0.63.1 - - 401 0 0 2293 7 1

The IIS logs roll over to a new file each day.


(ruflin) #6

Could you check two more things?

  • Run filebeat with -e -d "publish" and check if filebeat pushes the most recent events
  • Run logstash with stdout or file output and check if it writes the most recent events correctly?

It would also help to first use a minimal LS config and see if all lines show up as expected and then dig deeper into the LS config to see if there is a potential problem.
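The first check is just running the binary in the foreground with publish debugging enabled (illustrative invocation; add -c with your config path if it is not in the default location):

```
filebeat.exe -e -d "publish"
```

For the second check, one sketch is to keep the existing input block and temporarily replace the Logstash outputs with stdout, so every received event is printed:

```
output {
  stdout { codec => rubydebug }
}
```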


#7

With today being 6/17 and executing filebeat with -e -d "publish", the first results displayed were from 6/14.

2016/06/17 15:10:35.307882 geolite.go:24: INFO GeoIP disabled: No paths were set under output.geoip.paths
2016/06/17 15:10:35.307882 logstash.go:106: INFO Max Retries set to: 3
2016/06/17 15:10:35.316883 outputs.go:126: INFO Activated logstash as output plugin.
2016/06/17 15:10:35.316883 publish.go:232: DBG  Create output worker
2016/06/17 15:10:35.316883 publish.go:274: DBG  No output is defined to store the topology. The server fields might not be filled.
2016/06/17 15:10:35.317883 publish.go:288: INFO Publisher name: SERVER
2016/06/17 15:10:35.321884 async.go:78: INFO Flush Interval set to: 1s
2016/06/17 15:10:35.322884 async.go:84: INFO Max Bulk Size set to: 2048
2016/06/17 15:10:35.322884 async.go:92: DBG  create bulk processing worker (interval=1s, bulk size=2048)
2016/06/17 15:10:35.322884 beat.go:147: INFO Init Beat: filebeat; Version: 1.2.3
2016/06/17 15:10:35.323884 beat.go:173: INFO filebeat sucessfully setup. Start running.
2016/06/17 15:10:35.323884 registrar.go:68: INFO Registry file set to: C:\ProgramData\filebeat\registry
2016/06/17 15:10:35.325884 registrar.go:80: INFO Loading registrar data from C:\ProgramData\filebeat\registry
2016/06/17 15:10:35.333885 prospector.go:133: INFO Set ignore_older duration to 24h0m0s
2016/06/17 15:10:35.333885 prospector.go:133: INFO Set close_older duration to 1h0m0s
2016/06/17 15:10:35.334885 prospector.go:133: INFO Set scan_frequency duration to 10s
2016/06/17 15:10:35.334885 prospector.go:90: INFO Invalid input type set:
2016/06/17 15:10:35.334885 prospector.go:93: INFO Input type set to: log
2016/06/17 15:10:35.334885 prospector.go:133: INFO Set backoff duration to 1s
2016/06/17 15:10:35.334885 prospector.go:133: INFO Set max_backoff duration to 10s
2016/06/17 15:10:35.334885 prospector.go:113: INFO force_close_file is disabled
2016/06/17 15:10:35.334885 prospector.go:143: INFO Starting prospector of type: log
2016/06/17 15:10:35.334885 spooler.go:77: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
2016/06/17 15:10:35.658917 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC2\u_ex160614.log.1
2016/06/17 15:10:35.660918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC4\u_ex160615.log.1
2016/06/17 15:10:35.660918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC4\u_ex160613.log.1
2016/06/17 15:10:35.659918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC4\u_ex160616.log.1
2016/06/17 15:10:35.660918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC4\u_ex160612.log.1
2016/06/17 15:10:35.661918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC1\u_ex160616.log.1
2016/06/17 15:10:35.662918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC2\u_ex160617.log
2016/06/17 15:10:35.662918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC3\u_ex160616.log.1
2016/06/17 15:10:35.662918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC2\u_ex160616.log.1
2016/06/17 15:10:35.662918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC4\u_ex160614.log.1
2016/06/17 15:10:35.664918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC1\u_ex160617.log
2016/06/17 15:10:35.664918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC2\u_ex160613.log.1
2016/06/17 15:10:35.665918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC2\u_ex160615.log.1
2016/06/17 15:10:35.666918 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC3\u_ex160612.log.1
2016/06/17 15:10:35.671919 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC3\u_ex160617.log
2016/06/17 15:10:35.671919 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC1\u_ex160614.log.1
2016/06/17 15:10:35.671919 log.go:113: INFO Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC2\u_ex160612.log.1

#8

And not sure if this could be causing a problem, but logstash log reports this:

:expected_charset=>"UTF-8", :level=>:warn}

Strangely enough, IIS is configured for UTC-8, and filebeat encoding is also specified as UTC-8


(ruflin) #9

I assume by UTC-8 you mean UTF-8? It also looks like you posted only the end of the error message. Could you post the full error?


#10
{:timestamp=>"2016-06-21T08:17:39.851000-0600", :message=>"Received an event that has a different character encoding than you configured.", :text=>"2W\\u0000\\u0000\\u0000\\u00012J\\u0000\\u0000\\u0000\\u0001\\u0000\\u0000\\u0001\\x9E{\\\"@metadata\\\":{\\\"type\\\":\\\"log\\\",\\\"beat\\\":\\\"filebeat\\\"},\\\"input_type\\\":\\\"log\\\",\\\"message\\\":\\\"2016-06-19 00:00:00 W3SVC3 SERVER 10.0.0.1 POST /API/Check - 80 - 10.0.0.2 Job - 200 0 0 558 185 368\\\",\\\"type\\\":\\\"log\\\",\\\"count\\\":1,\\\"fields\\\":null,\\\"beat\\\":{\\\"hostname\\\":\\\"server\\\",\\\"name\\\":\\\"server\\\"},\\\"@timestamp\\\":\\\"2016-06-21T13:01:01.449Z\\\",\\\"source\\\":\\\"C:\\\\\\\\inetpub\\\\\\\\logs\\\\\\\\LogFiles\\\\\\\\W3SVC3\\\\\\\\u_ex160619.log\\\",\\\"offset\\\":304}2W\\u0000\\u0000\\u0000\\u00012J\\u0000\\u0000\\u0000\\u0001\\u0000\\u0000\\u0001\\x9E{\\\"@metadata\\\":{\\\"type\\\":\\\"log\\\",\\\"beat\\\":\\\"filebeat\\\"},\\\"input_type\\\":\\\"log\\\",\\\"source\\\":\\\"C:\\\\\\\\inetpub\\\\\\\\logs\\\\\\\\LogFiles\\\\\\\\W3SVC3\\\\\\\\u_ex160619.log\\\",\\\"offset\\\":304,\\\"message\\\":\\\"2016-06-19 00:00:00 W3SVC3 server 10.0.0.1 POST /API/Check - 80 - 10.0.0.2 Job - 200 0 0 558 185 368\\\",\\\"type\\\":\\\"log\\\",\\\"count\\\":1,\\\"fields\\\":null,\\\"beat\\\":{\\\"hostname\\\":\\\"server\\\",\\\"name\\\":\\\"server\\\"},\\\"@timestamp\\\":\\\"2016-06-21T13:01:01.449Z\\\"}2W\\u0000\\u0000\\u0000\\u00012J\\u0000\\u0000\\u0000\\u0001\\u0000\\u0000\\u0001\\x9E{\\\"@metadata\\\":{\\\"type\\\":\\\"log\\\",\\\"beat\\\":\\\"filebeat\\\"},\\\"input_type\\\":\\\"log\\\",\\\"message\\\":\\\"2016-06-19 00:00:00 W3SVC3 server 10.0.0.1 POST /API/Check - 80 - 10.0.0.2 Job - 200 0 0 558 185 
368\\\",\\\"type\\\":\\\"log\\\",\\\"count\\\":1,\\\"fields\\\":null,\\\"beat\\\":{\\\"hostname\\\":\\\"server\\\",\\\"name\\\":\\\"server\\\"},\\\"@timestamp\\\":\\\"2016-06-21T13:01:01.449Z\\\",\\\"source\\\":\\\"C:\\\\\\\\inetpub\\\\\\\\logs\\\\\\\\LogFiles\\\\\\\\W3SVC3\\\\\\\\u_ex160619.log\\\",\\\"offset\\\":304}", :expected_charset=>"UTF-8", :level=>:warn}

(ruflin) #11

There is definitely something wrong with the encoding. What is the encoding of the files you are reading? I suspect these files are not in UTF-8. If they are not, you have to change the encoding setting in filebeat accordingly.
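For example, if the files turned out to be UTF-16 little-endian (a hypothetical value here; the full list of supported encodings is in the Filebeat docs), the per-prospector setting would look like:

```
filebeat:
  prospectors:
    -
      paths:
        - C:\inetpub\logs\LogFiles\*\*
      encoding: utf-16le
```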


#12

hi Ruflin,

I spent a lot of time troubleshooting this yesterday and believe I finally got it working.

  1. Confirmed that the IIS logs were configured for UTF-8
  2. Confirmed filebeat configuration was also set to UTF-8 encoding
  3. Confirmed parsing looked good in filebeat debug logs

And here is what I changed that seems to have corrected this:

Logstash config was set to:

input {
  tcp { port => 3516 type => IIS }
}

if [type] == "IIS" {
  if [message] =~ "^#" {
    drop {}
  }

  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{DATA:service-name} %{DATA:hostname} %{IPV4:site} %{DATA:method} %{URIPATH:page} %{NOTSPACE:querystring} %{INT:port} %{NOTSPACE:username} %{IPV4:clienthost} %{NOTSPACE:useragent} %{DATA:referer} %{INT:response} %{INT:subresponse} %{INT:scstatus} %{NUMBER:sc-bytes} %{NUMBER:cs-bytes} %{NUMBER:iis-time-taken}"]
  }

  date {
    match => ["log_timestamp", "YYYY-MM-dd HH:mm:ss"]
  }

  mutate {
    convert => ["sc-bytes", "integer"]
    convert => ["cs-bytes", "integer"]
    convert => ["iis-time-taken", "integer"]
  }
}

And was changed to:

input {
  beats { port => 3516 type => IIS }
}

So this raises the question: why does the tcp input handle the encoding differently from the beats input?


(Andrew Kroh) #13

Filebeat is expected to communicate with the Logstash beats input, not the tcp input. Filebeat talks to Logstash over the lumberjack protocol, and that is what the beats input implements. The data sent by Beats is JSON, and the beats input knows this and decodes it; the tcp input just treats the raw lumberjack frames as plain text, which is why Logstash warned about the character encoding.

Filebeat is where you need to set the correct encoding type for your log file.

In your Logstash config, setting type to IIS on the beats input will not have an effect because type is already set by Filebeat (see Logstash's docs for type). To set the type to IIS in Filebeat, use the document_type config setting.
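A minimal sketch of that pairing, reusing the port from the thread (Filebeat 1.x syntax; anything not shown is assumed to keep its existing value):

```
# filebeat.yml - document_type sets the event's type at the source
filebeat:
  prospectors:
    -
      paths:
        - C:\inetpub\logs\LogFiles\*\*
      document_type: IIS
```

and on the Logstash side:

```
# the beats input speaks lumberjack; no type option is needed here
input {
  beats { port => 3516 }
}
```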


#14

Thanks Andrew - marked as answer. In logstash I have type set to IIS because I have other logs coming in with different grok expressions. But you are correct, in filebeat I do have document_type set to IIS.


(system) #15

This topic was automatically closed after 21 days. New replies are no longer allowed.