Missing fields while indexing Tomcat access logs in Elasticsearch

I have a localhost_access_log.2016-03-23.txt file that I am trying to index into Elasticsearch via Logstash.

The file looks as follows:
0:0:0:0:0:0:0:1 - - [23/Mar/2016:12:20:46 +0530] "GET / HTTP/1.1" 200 11444
0:0:0:0:0:0:0:1 - admin [23/Mar/2016:12:24:42 +0530] "GET /manager/status HTTP/1.1" 200 6595
10.76.170.72 - - [23/Mar/2016:15:48:54 +0530] "GET / HTTP/1.1" 200 11444

For that I have created the following configuration file:
input {
  file {
    type => "access-log"
    path => "D:/apache-tomcat-7.0.37/logs/localhost_access_log.2016-03-23.txt"
    codec => multiline {
      negate => true
      pattern => "(^([0-9A-Fa-f]{1,4}:){7}[0-9A-Fa-f]{1,4}|(\d{1,3}.){3}\d{1,3})"
      what => "previous"
    }
  }
}
filter {
  if [type] == "access-log" {
    grok {
      match => [ "message", "%{IP:client} %{NOTSPACE:user} %{NOTSPACE:remoteUser} [%{DATA:timestamp}] %{WORD:method} %{NOTSPACE:request} %{NUMBER:status} %{NUMBER:bytes}" ]
    }
  }
}
output {
  stdout {}
  elasticsearch {
    hosts => "localhost"
    index => "access-logs_20160328"
  }
}

But when I configure the index in Kibana, I am not able to see fields like client, user, etc.

Reduce the complexity of the system. Comment out the elasticsearch output and replace it with a stdout { codec => rubydebug } output. What do the events look like then?
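For example, something like this (only the output section changes):

output {
  stdout { codec => rubydebug }
}

The rubydebug codec dumps each event with all of its fields to the console, so you can see at a glance whether grok extracted anything or tagged the event with _grokparsefailure.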

As per your reply, I got the following output (sample):
{
       "message" => "0:0:0:0:0:0:0:1 - admin [23/Mar/2016:13:30:06 +0530] \"GET /host-manager/html HTTP/1.1\" 403 286928690:0:0:0:0:0:0:1 - admin [23/Mar/2016:13:30:06 +0530] \"GET /host-manager/html HTTP/1.1\" 403 2869\r",
      "@version" => "1",
    "@timestamp" => "2016-03-29T12:43:38.861Z",
          "host" => "DIN03000865",
          "path" => "D:/apache-tomcat-7.0.37/logs/localhost_access_log.2016-03-23.txt",
          "type" => "access-log",
          "tags" => [
        [0] "_grokparsefailure"
    ]
}

Okay, so this establishes that the grok expression doesn't match the input. Looking more closely at it,

 match => [ "message", "%{IP:client} %{NOTSPACE:user} %{NOTSPACE:remoteUser} [%{DATA:timestamp}] %{WORD:method} %{NOTSPACE:request} %{NUMBER:status} %{NUMBER:bytes}" ]

it's clear that the part after %{NOTSPACE:request} isn't correct since you're ignoring the "HTTP/1.1" that follows. I also hope you're escaping the square brackets around the timestamp field with backslashes.
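Something along these lines should be closer (a rough sketch, untested; note that the literal double quotes around the request part of the line have to be matched too):

 match => [ "message", "%{IP:client} %{NOTSPACE:user} %{NOTSPACE:remoteUser} \[%{DATA:timestamp}\] \"%{WORD:method} %{NOTSPACE:request} HTTP/%{NUMBER:httpversion}\" %{NUMBER:status} %{NUMBER:bytes}" ]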

I changed the configuration as follows, and now it works:
input {
  file {
    type => "access-log"
    path => "D:/apache-tomcat-7.0.37/logs/localhost_access_log.2016-03-23.txt"
  }
}
filter {
  if [type] == "access-log" {
    grok {
      match => [ "message", "%{COMMONAPACHELOG}" ]
    }
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "localhost"
    index => "access-logs_20160329"
  }
}

But it would be nice if you could help me understand what was wrong with my previous config.

I used the grok debugger to arrive at the %{COMMONAPACHELOG} pattern, but I don't understand how to create such patterns myself. Is there any beginner-friendly reference you can point me to for writing grok patterns and the patterns used in the multiline codec?

Have you looked at the grok filter's documentation? It describes things at a fairly basic level.
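In short, grok patterns are just named regular expressions composed from the building blocks that ship with Logstash (see the grok-patterns file in the logstash-patterns-core repository). Your access-log line could be matched piece by piece with standard patterns, roughly like this (a simplified sketch of what %{COMMONAPACHELOG} expands to, not its exact definition):

%{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "%{WORD:verb} %{NOTSPACE:request} HTTP/%{NUMBER:httpversion}" %{NUMBER:response} %{NUMBER:bytes}

Each %{PATTERN:field} element matches one token of the line and stores it in the named field. The usual approach is to build the expression left to right, one token at a time, testing against a sample line in the grok debugger after each addition. The same goes for the multiline codec's pattern option, which is a regular expression (grok patterns are allowed there too) that decides where one event ends and the next begins.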