Grok fields are not visible in Kibana

I have used Filebeat to capture IIS logs, and I have used the following grok pattern to parse them:

match => ["message", "%{TIMESTAMP_ISO8601:timestamp} %{IPORHOST:serverip} %{WORD:verb} %{NOTSPACE:request} %{NOTSPACE:querystring} %{NUMBER:port} %{NOTSPACE:auth} %{IPORHOST:clientip} %{NOTSPACE:browser}/%{NOTSPACE:agent} %{NOTSPACE:referrer} %{NUMBER:response} %{NUMBER:sub_response} %{NUMBER:sc_status} %{NUMBER:responsetime}"]

but in Kibana I can't see the fields extracted by the grok filter.

I tried refreshing the index pattern, and removing and re-adding the index, but no luck :confused:

Please help.

My logstash.conf is as follows:

input {
  beats {
    port => 5000
  }
}

First filter

filter {
  # ignore log comments
  if [message] =~ "^#" {
    drop {}
  }

  grok {
    #patterns_dir => "./patterns"

    match => ["message", "%{TIMESTAMP_ISO8601:timestamp} %{IPORHOST:serverip} %{WORD:verb} %{NOTSPACE:request} %{NOTSPACE:querystring} %{NUMBER:port} %{NOTSPACE:auth} %{IPORHOST:clientip} %{NOTSPACE:browser}/%{NOTSPACE:agent} %{NOTSPACE:referrer} %{NUMBER:response} %{NUMBER:sub_response} %{NUMBER:sc_status} %{NUMBER:responsetime}"]
  }

  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss" ]
    locale => "en"
  }
}

Second filter

filter {
  if "_grokparsefailure" in [tags] {

  } else {
    # on success, remove the message field to save space
    mutate {
      remove_field => ["message", "timestamp"]
    }
  }
}

output {
  elasticsearch {
    hosts => ["172.24.80.86:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

Please show what an example event looks like. Please don't use screenshots. Either copy/paste from the JSON tab in Kibana or use a stdout { codec => rubydebug } output that you copy/paste from.

Following is the event generated in Kibana at the moment:

@timestamp: November 3rd 2016, 13:58:58.473
offset: 307,298
@version: 1
input_type: log
beat.hostname: BLIFE-TEST
beat.name: BLIFE-TEST
beat.version: 5.0.0
host: BLIFE-TEST
source: C:\inetpub\logs\LogFiles\W3SVC2\u_ex161103.log
message: 2016-11-03 08:28:04 BLIFE-TEST 172......... POST /Medical_Payment/MedicalUI/LabTestPaymentUI.aspx - 80 172......... HTTP/1.1 http://blife-test/Medical_Payment/MedicalUI/LabTestPaymentUI.aspx blife-test 200 0 0 31186 24882 50
type: log
tags: beats_input_codec_plain_applied, _grokparsefailure
_id: AVgpVqlj6ezIdl4F1huS
_type: log
_index: filebeat-2016.11.03
_score: -

No fields are created because your grok expression doesn't match your input logs.

You did not copy/paste from the JSON tab in Kibana like I asked you to do. In this particular case it didn't matter but another time it might.

Sorry, I couldn't find the JSON tab in Kibana 5.0. Please help.

Following is my raw log from IIS:

2016-11-02 02:29:47 172....... GET /Secworks/Signin.asp - 80 - 172.......... Mozilla/4.0+(compatible;+MSIE+6.0;+Windows+NT+5.1;+SV1;+.NET+CLR+2.0.50727;+.NET4.0C) - 200 0 0 286
2016-11-02 02:29:55 172.......... POST /Secworks/Signin_handler.asp - 80 - 172.24.102.88 Mozilla/4.0+(compatible;+MSIE+6.0;+Windows+NT+5.1;+SV1;+.NET+CLR+2.0.50727;+.NET4.0C) http://blife-test/Secworks/Signin.asp 200 0 0 448

I tested my grok filter at http://grokconstructor.appspot.com, and there it worked: it matched the log the way I needed. But when the same grok filter is applied in logstash.conf it doesn't work.

Please help.

> Sorry, I couldn't find the JSON tab in Kibana 5.0. Please help.

Maybe it has disappeared or is called something else in Kibana 5. I don't know.

> Following is my raw log from IIS:

I can't spot any obvious errors in your grok expression. I suggest you start with the simplest possible one, %{TIMESTAMP_ISO8601:timestamp}, and verify that it works. Then build the expression from there. At some point it's going to start failing again, and then you know which part didn't work. I strongly suggest you use a stdout { codec => rubydebug } output while testing this.
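A minimal filter for that kind of incremental testing might look like this (just a sketch; keep the rest of your configuration as it is and swap in progressively longer patterns):

filter {
  grok {
    # Start with just the timestamp; once this matches, append the next
    # field from your full pattern (%{IPORHOST:serverip}) and test
    # again, and so on until it fails.
    match => ["message", "%{TIMESTAMP_ISO8601:timestamp}"]
  }
}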

I'm a bit new to grok and ELK.

Could you please explain how to use stdout { codec => rubydebug }?

Should it go in the logstash.conf file or somewhere else? Please help.

Yes, it's a Logstash output that goes in your Logstash configuration.
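For example, you could add it alongside the elasticsearch output you already have (a sketch based on the configuration you posted above):

output {
  elasticsearch {
    hosts => ["172.24.80.86:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
  # Prints each processed event to the Logstash console so you can see
  # which fields grok extracted and whether _grokparsefailure was tagged.
  stdout { codec => rubydebug }
}

Once you're done debugging you can remove the stdout output again.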

OK, thank you Magnus.