Logstash-2.2.2, windows, IIS log file format

Hello,

When I parse an IIS log file in UTF-8 format I get a [0] "_grokparsefailure" tag, and when I parse a log file in ANSI format nothing happens at all; Logstash just prints "Logstash startup completed" on the console. There are almost 1000 files on my server, so I can't convert each one from ANSI to UTF-8.
Can you please point out what I need to change in my config file? I'm also attaching the debug output from parsing a file in UTF-8 format.
I'm running Elasticsearch on the same box and it works fine; I'm also able to telnet to port 9200 on 127.0.0.1.

2016-03-26T05:40:40.764Z WIN-AK44913P759 2016-03-24 00:16:31 W3SVC20 SANDBOXWEB01 172.x.x.x GET /healthmonitor.axd - 80 - 172.x.x.x HTTP/1.1 - - - www.xyz.net 200 0 0 4698 122 531
{
       "message" => "2016-03-24 04:43:02 W3SVC20 ODSANDBOXWEB01 172.x.x.x GET /healthmonitor.axd - 80 - 172.x.x.x HTTP/1.1 - - - www.xyz.net 200 0 0 4698 122 703\r",
      "@version" => "1",
    "@timestamp" => "2016-03-26T05:42:15.045Z",
          "path" => "C:\IISLogs/u_ex160324.log",
          "host" => "WIN-AK44913P759",
          "type" => "IISLog",
          "tags" => [
        [0] "_grokparsefailure"
    ]
}

Below is my Logstash config file:
input {
  file {
    type => "IISLog"
    path => "C:\IISLogs/u_ex*.log"
    start_position => "beginning"
  }
}

filter {
  # ignore log comments
  if [message] =~ "^#" {
    drop {}
  }

  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{WORD:iisSite} %{IPORHOST:site} %{WORD:method} %{URIPATH:page} %{NOTSPACE:querystring} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clienthost} %{NOTSPACE:useragent} %{NOTSPACE:referer} %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:scstatus} %{NUMBER:bytes:int} %{NUMBER:timetaken:int}"]
  }

  # set the event timestamp from the log
  date {
    match => [ "log_timestamp", "YYYY-MM-dd HH:mm:ss" ]
    timezone => "Etc/UCT"
  }

  useragent {
    source => "useragent"
    prefix => "browser"
  }

  mutate {
    remove_field => [ "log_timestamp" ]
  }
}

# output logs to console and to elasticsearch
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
  }
  stdout { codec => rubydebug }
}

Hi Team,

Any update? This is a bit urgent.

Thanks
J

If you look at the message and line the components up against the parts of the grok expression they currently match, it is clear that several components have been overlooked and need to be added.

2016-03-24 04:43:02 -> %{TIMESTAMP_ISO8601:log_timestamp}
W3SVC20             -> %{WORD:iisSite}
ODSANDBOXWEB01      -> %{IPORHOST:site}
172.x.x.x           -> %{WORD:method}
GET                 -> %{URIPATH:page}
/healthmonitor.axd  -> %{NOTSPACE:querystring}
-                   -> %{NUMBER:port}
80                  -> %{NOTSPACE:username}
-                   -> %{IPORHOST:clienthost}
172.x.x.x           -> %{NOTSPACE:useragent}
HTTP/1.1            -> %{NOTSPACE:referer}
-                   -> %{NUMBER:response}
-                   -> %{NUMBER:subresponse}
-                   -> %{NUMBER:scstatus}
www.xyz.net         -> %{NUMBER:bytes:int}
200                 -> %{NUMBER:timetaken:int}
0                   -> ?
0                   -> ?
4698                -> ?
122                 -> ?
703\r               -> ?

You may also need to use a mutate filter to remove the trailing '\r' unless you account for that in the grok expression.
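
Lining the grok patterns up one-for-one with the 21 columns in the #Fields header gives something like the sketch below. The extra field names (computername, httpversion, cookie, cshost, csbytes) are illustrative, not part of the original config, and the mutate/gsub step is one way to deal with the trailing carriage return:

```
filter {
  # strip the trailing carriage return first (or match \r in the grok expression)
  mutate {
    gsub => ["message", "\r$", ""]
  }

  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp} %{WORD:iisSite} %{HOSTNAME:computername} %{IPORHOST:site} %{WORD:method} %{URIPATH:page} %{NOTSPACE:querystring} %{NUMBER:port} %{NOTSPACE:username} %{IPORHOST:clienthost} %{NOTSPACE:httpversion} %{NOTSPACE:useragent} %{NOTSPACE:cookie} %{NOTSPACE:referer} %{NOTSPACE:cshost} %{NUMBER:response} %{NUMBER:subresponse} %{NUMBER:scstatus} %{NUMBER:bytes:int} %{NUMBER:csbytes:int} %{NUMBER:timetaken:int}"]
  }
}
```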

Hi Christian,
I've applied the same grok expression as above, but I'm still getting the same [0] "_grokparsefailure" error.

Here is my log file
#Fields: date time s-sitename s-computername s-ip cs-method cs-uri-stem cs-uri-query s-port cs-username c-ip cs-version cs(User-Agent) cs(Cookie) cs(Referer) cs-host sc-status sc-substatus sc-win32-status sc-bytes cs-bytes time-taken
2016-03-25 00:00:01 W3SVC20 SANDBOXWEB01 172.30.34.167 GET /healthmonitor.axd - 80 - 172.30.34.4 HTTP/1.1 - - - www.uk.sandbox.orderdynamics.net 200 0 0 4375 122 31
2016-03-25 00:00:01 W3SVC20 SANDBOXWEB01 172.30.34.167 GET /healthmonitor.axd - 80 - 172.30.34.4 HTTP/1.1 - - - www.-uk.sandbox.orderdynamics.net 200 0 0 4374 122 15

Below is my logstash.conf
input {
  file {
    type => "IISLog"
    path => "C:\IISLogs\u_ex160324.log"
    start_position => "beginning"
  }
}

filter {
  # ignore log comments
  if [message] =~ "^#" {
    drop {}
  }

  grok {
    match => ["message",
      "%{TIMESTAMP_ISO8601:log_timestamp}
      %{WORD:iisSite}
      %{HOSTNAME}
      %{IPORHOST:site}
      %{WORD:method}
      %{URIPATH:page}
      %{NOTSPACE:querystring}
      %{NUMBER:port}
      %{NOTSPACE:username}
      %{IPORHOST:clienthost}
      HTTP/%{NUMBER:httpversion}
      %{NOTSPACE:referer}
      %{NOTSPACE:querystring}
      %{NOTSPACE:querystring}
      %{NUMBER:response}
      %{NUMBER:bytes:int}
      %{NUMBER:bytes:int}
      %{NUMBER:bytes:int}
      %{NUMBER:bytes:int}
      %{NUMBER:bytes:int}
      %{NUMBER:bytes:int}"]
  }

  useragent {
    source => "useragent"
    prefix => "browser"
  }
}

# output logs to console and to elasticsearch
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
  }
  stdout { codec => rubydebug }
}

Hi Team,

Any update on the above comment?

Thanks
J

Hi Christian,

If you could shed some light on this, it would be very helpful.

Thanks
J

Don't break the grok pattern up into multiple lines. I just chose to display it that way to show how the fields did not match up. You are also capturing the bytes and querystring fields multiple times; I suspect these should be different fields. The general recommendation when building grok expressions is to start from the beginning and add one field at a time.

You also do not seem to be capturing the useragent field that you are trying to use in the user agent filter.
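
Concretely, the useragent filter can only work on a field that the grok expression actually captured. A minimal, abbreviated sketch (assuming the cs(User-Agent) column is the twelfth field, as in the #Fields header above; the "..." stands for the surrounding fields and the cookie/referer names are illustrative):

```
filter {
  grok {
    # the cs(User-Agent) column must land in a field named "useragent"
    # for the filter below to have something to parse
    match => ["message", "... %{NOTSPACE:useragent} %{NOTSPACE:cookie} %{NOTSPACE:referer} ..."]
  }

  useragent {
    source => "useragent"   # must match the grok capture name
    prefix => "browser"
  }
}
```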