_grokparsefailure while creating field using grok filter

Hi,
I am new to Logstash. I have created a grok pattern for the Tomcat access log below:
10.0.6.35 - - [21/Oct/2019:00:00:21 +0000] "GET /rest/V1/productlist/category/4/ar/18/20 HTTP/1.1" 200 15917 25
My Logstash conf file is:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => [ "message","%{IP:clientip} - - [%{HTTPDATE:times}] "%{WORD:action} /%{GREEDYDATA:api} %{WORD:protocol}/%{NUMBER:protocolNum}" %{NUMBER:status} %{NUMBER:bytes} %{NUMBER:duration}" ]
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["172.31.30.73:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}

When I start Logstash it shows _grokparsefailure and no fields are created in Kibana. I want the duration, status, etc. fields to appear in Kibana.

Please help me ASAP.
Thanks in advance,
Vishal Sharma

Are you escaping the square brackets in the pattern?

grok { match => [ "message","%{IP:clientip} - - \[%{HTTPDATE:times}\] \"%{WORD:action} /%{GREEDYDATA:api} %{WORD:protocol}/%{NUMBER:protocolNum}\" %{NUMBER:status} %{NUMBER:bytes} %{NUMBER:duration}" ] }
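To see why the escaping matters, here is a rough Python sketch of what the escaped grok pattern compiles down to as a regex (note: the named groups below are simplified stand-ins for the grok patterns, e.g. a plain dotted-quad match instead of the full %{IP} definition), run against the sample log line from the question:

```python
import re

# Simplified regex equivalent of the escaped grok pattern.
# \[ and \] match the literal brackets around the timestamp,
# and \" matches the literal quotes around the request.
pattern = re.compile(
    r'(?P<clientip>\d+\.\d+\.\d+\.\d+) - - '
    r'\[(?P<times>[^\]]+)\] '
    r'"(?P<action>\w+) /(?P<api>.*) (?P<protocol>\w+)/(?P<protocolNum>[\d.]+)" '
    r'(?P<status>\d+) (?P<bytes>\d+) (?P<duration>\d+)'
)

line = ('10.0.6.35 - - [21/Oct/2019:00:00:21 +0000] '
        '"GET /rest/V1/productlist/category/4/ar/18/20 HTTP/1.1" '
        '200 15917 25')

m = pattern.match(line)
print(m.group('clientip'), m.group('status'), m.group('duration'))
# → 10.0.6.35 200 25
```

With the brackets left unescaped, `[...]` is interpreted as a regex character class instead of literal brackets, so the match fails and Logstash tags the event with _grokparsefailure.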

Yes I am removing square brackets from the pattern.
Please tell me exactly what I am doing wrong.

I have changed my grok pattern but am still getting the same error:
match => { "message"=>"%{IP:clientip} - - [%{NOTSPACE:date} +%{INT}] "%{WORD:action} /%{GREEDYDATA:api} %{WORD:protocol}/%{NUMBER:protocolNum}" %{NUMBER:status} %{NUMBER}"}
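The revised pattern above still leaves the square brackets unescaped. A quick Python sketch shows what an unescaped `[` does in regex terms (the `\d+ - - [` fragment here is just an illustrative stand-in for the start of the pattern):

```python
import re

# An unescaped [ opens a character class; with no closing ]
# the regex will not even compile.
try:
    re.compile(r'\d+ - - [')
    compiled = True
except re.error as e:
    compiled = False
    print('regex error:', e)
```

Even when the expression does compile (because a matching `]` happens to follow), the bracketed span is treated as a set of characters to match, not as literal `[` and `]`, so the line never matches.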

I do not know. When I use the grok filter that I posted it successfully parses the line you posted.

OK, thanks, I got the solution.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.