Please help. I can't figure out how to get IIS logs in

Hello,

I installed ELK a couple of weeks ago and I'm pulling out what little hair I have left. I'm having trouble getting IIS log data to show up. I have Filebeat installed on the clients and am using IIS Advanced Logging. I have the ELK stack on a Windows box and need the following IIS log fields to produce data:

$date, $time, $s-ip, $cs-uri-stem, $cs-uri-query, $s-port, $cs-username, $c-ip, $X-Forwarded-For, $csUser-Agent, $cs-Referer, $sc-status, $sc-substatus, $sc-win32-status, $time-taken

I just can't grasp it. Can someone please help? And apologies for being an idiot :frowning:

Is the problem that

  • Filebeat isn't shipping the log file data to Logstash, or
  • Logstash receives the data but doesn't parse it correctly?
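
One quick way to tell the two apart: run Logstash with only a stdout output and see whether events from Filebeat show up at all (a minimal sketch, not a full config):

```
output {
  # temporary debug output: if events appear here, Filebeat is shipping fine
  # and the problem is in the filter stage
  stdout { codec => rubydebug }
}
```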

Logstash doesn't parse it correctly

What configuration do you have so far? And if you're not using the csv filter for this, why not?
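
For reference, if the fields really were comma-separated, a csv filter along these lines would be the simpler option (a rough sketch; the column names are only placeholders taken from the field list you posted):

```
filter {
  csv {
    # placeholder column names based on the IIS field list above
    columns => [
      "date", "time", "s-ip", "cs-uri-stem", "cs-uri-query", "s-port",
      "cs-username", "c-ip", "X-Forwarded-For", "csUser-Agent",
      "cs-Referer", "sc-status", "sc-substatus", "sc-win32-status", "time-taken"
    ]
    separator => ","
  }
}
```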

As I mentioned, I'm not understanding what needs to be done. This is what I have:

```
input {
  beats {
    port => 5044
    type => 'iis'
  }
}
```

First filter:

```
filter {
  # ignore IIS log comment lines
  if [message] =~ "^#" {
    drop {}
  }

  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{IPV4:ServerIP} %{NOTSPACE:stem} %{DATA:query} %{NUMBER:serverPort} %{DATA:username} %{IPV4:clientIP} %{QUOTEDSTRING:ForwardedIP} %{QUOTEDSTRING:useragent} %{QUOTEDSTRING:referer} %{NUMBER:httpStatusCode} %{NUMBER:scSubstatus} %{NUMBER:scwin32status} %{NUMBER:timeTakenMS}"]
    tag_on_failure => [ ]
  }

  date {
    # must reference the field the grok above actually produces, milliseconds included
    match => [ "log_timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
    locale => "en"
  }
}
```

Second filter:

```
filter {
  if "_grokparsefailure" in [tags] {

  } else {
    # on success remove the raw message and timestamp string to save space
    mutate {
      remove_field => ["message", "log_timestamp"]
    }
  }
}
```

```
output {
  elasticsearch { hosts => ["127.0.0.1:9200"] }
  stdout { codec => rubydebug }
}
```

Can you give an example line from your log? The field list you posted earlier suggested comma-separated values, but that's contradicted by your grok filter.

Sure. Here's a line from the log:

2016-06-28 20:30:37.377 192.168.20.21 /Ping.aspx - 443 - 192.168.20.252 "10.0.1.26" "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.103 Safari/537.36" "https://rmsnetqa.tsd-inc.com/Default.aspx" 200 0 0 5756

Your grok filter works fine:

```
$ cat data
2016-06-28 20:30:37.377 192.168.20.21 /Ping.aspx - 443 - 192.168.20.252 "10.0.1.26" "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.103 Safari/537.36" "https://rmsnetqa.tsd-inc.com/Default.aspx" 200 0 0 5756
$ cat test.config 
input { stdin { } }
output { stdout { codec => rubydebug } }
filter {
  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{IPV4:ServerIP} %{NOTSPACE:stem} %{DATA:query} %{NUMBER:serverPort} %{DATA:username} %{IPV4:clientIP} %{QUOTEDSTRING:ForwardedIP} %{QUOTEDSTRING:useragent} %{QUOTEDSTRING:referer} %{NUMBER:httpStatusCode} %{NUMBER:scSubstatus} %{NUMBER:scwin32status} %{NUMBER:timeTakenMS}"]
  }
}
$ /opt/logstash/bin/logstash -f test.config < data
Settings: Default pipeline workers: 2
Pipeline main started
{
           "message" => "2016-06-28 20:30:37.377 192.168.20.21 /Ping.aspx - 443 - 192.168.20.252 \"10.0.1.26\" \"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.103 Safari/537.36\" \"https://rmsnetqa.tsd-inc.com/Default.aspx\" 200 0 0 5756",
          "@version" => "1",
        "@timestamp" => "2016-07-27T08:59:13.933Z",
              "host" => "hallonet",
     "log_timestamp" => "2016-06-28 20:30:37.377",
          "ServerIP" => "192.168.20.21",
              "stem" => "/Ping.aspx",
             "query" => "-",
        "serverPort" => "443",
          "username" => "-",
          "clientIP" => "192.168.20.252",
       "ForwardedIP" => "\"10.0.1.26\"",
         "useragent" => "\"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.103 Safari/537.36\"",
           "referer" => "\"https://rmsnetqa.tsd-inc.com/Default.aspx\"",
    "httpStatusCode" => "200",
       "scSubstatus" => "0",
     "scwin32status" => "0",
       "timeTakenMS" => "5756"
}
Pipeline main has been shutdown
stopping pipeline {:id=>"main"}
```
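
One small thing worth noting in the output above: QUOTEDSTRING keeps the surrounding double quotes in the captured values (see ForwardedIP, useragent and referer). If you'd rather index them without the quotes, a mutate/gsub step after the grok could strip them (a sketch):

```
filter {
  mutate {
    # remove the literal double quotes that QUOTEDSTRING leaves around the values
    gsub => [
      "ForwardedIP", "\"", "",
      "useragent",   "\"", "",
      "referer",     "\"", ""
    ]
  }
}
```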

OK. I'll research further. Thanks

If you don't want to mess with the grok filter, I wrote a guide on how to process IIS logs: http://www.secureict.info/2016/07/elastic-stack-process-iis-logs.html