Grok Filter Parsing Failure

Hi,

I was able to integrate Filebeat, Logstash, Elasticsearch, and Kibana, and I can parse logs using the default Apache log grok filter.

Next, I tried to parse some specific information and push it to Elasticsearch.

However, the Logstash console shows me a grok filter failure message.

My grok pattern in logstash.conf is:

filter {
  grok {
    match => { "message" => "%{IP:client} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:bytes} %{NUMBER:duration}" }
  }
}

The event printed to the console (note the _grokparsefailure tag) is as follows:

{
  "@timestamp" => 2017-02-22T09:51:22.926Z,
  "offset" => 2995,
  "@version" => "1",
  "input_type" => "log",
  "beat" => {
    "hostname" => "LT0004658",
    "name" => "LT0004658",
    "version" => "5.2.0"
  },
  "host" => "LT0004658",
  "source" => "D:\Softwares\LogMonitoring\logs\access-161212.log",
  "message" => "10.129.34.54 10.129.1.180 - - [12/Dec/2016:12:55:56 +0530] \"GET /_ui/desktop/theme-marketplace/img/fulfilled.png HTTP/1.1\" 200 1824 \"https://uat-jiomall.ril.com/search/?search-category=All&keyword=shirt\" \"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/54.0.2840.99 Safari/537.36\" \"image/png\" 200 4189 [Response:0]",
  "type" => "log",
  "tags" => [
    [0] "beats_input_codec_plain_applied",
    [1] "_grokparsefailure"
  ]
}

Any help in this regard is much appreciated.

Thanks.
Saurabh

Why not start with the standard Apache Combined pattern instead of building everything from scratch? Your current log entry looks like a standard Apache Combined line with an extra IP address at the beginning and a couple of extra fields at the end.

So you mean I should use the following configuration and it will parse everything for me?

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    #manage_template => false
    #index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    #document_type => "%{[@metadata][type]}"
  }

  stdout { codec => rubydebug }
}

No, but you can use COMBINEDAPACHELOG as a starting point for your expression. Start by changing the pattern to %{IP:whatever} %{COMBINEDAPACHELOG} to pick up the extra IP address. Does that work so far? Then continue with the extra fields at the end of the line.
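That intermediate step can be sketched as the following filter (the field name client_ip is just an illustrative choice; the trailing fields after the combined part are deliberately not yet covered):

```conf
filter {
  grok {
    # Leading IP captured into a field of your choosing ("client_ip" here);
    # the rest of the line is matched by the stock combined pattern.
    # The trailing fields ("image/png" 200 4189 [Response:0]) still need
    # their own patterns; grok matches are not anchored at the end of the
    # line, so this should succeed and leave them unparsed for now.
    match => { "message" => "%{IP:client_ip} %{COMBINEDAPACHELOG}" }
  }
}
```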

How do we use key/value pairs in the match format?

For instance, in %{IP:xyz}, is IP a predefined value that must be used? Similarly, what will "xyz" be in this example?

Let's say that in my Apache log format I use the custom directive %T, which prints the time taken to serve the request. How would I parse this in Logstash?

Thanks in advance.

For instance, in %{IP:xyz}, is IP a predefined value that must be used?

IP is a grok pattern. They are defined here: https://github.com/logstash-plugins/logstash-patterns-core/tree/master/patterns

Similarly, what will "xyz" be in this example?

That's the name of the field where the string matched by the pattern will be placed.
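As a concrete sketch (matching a hypothetical line "10.129.34.54 GET /index.html"):

```conf
filter {
  grok {
    # "IP", "WORD", and "URIPATHPARAM" are predefined grok patterns;
    # "client", "method", and "request" are field names of your choosing.
    match => { "message" => "%{IP:client} %{WORD:method} %{URIPATHPARAM:request}" }
  }
}
# Resulting event fields:
#   "client"  => "10.129.34.54"
#   "method"  => "GET"
#   "request" => "/index.html"
```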

Let's say that in my Apache log format I use the custom directive %T, which prints the time taken to serve the request. How would I parse this in Logstash?

%T expands to a floating point number, right? Then NUMBER would be a suitable pattern:

%{NUMBER:duration}

Additionally, to make sure the field is stored as a floating point number (rather than a string):

%{NUMBER:duration:float}
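Putting the pieces together, one possible pattern for the full sample line might look like this (field names such as client_ip, content_type, and response_flag are illustrative, not standard; QS is the predefined grok pattern for a quoted string):

```conf
filter {
  grok {
    # %{IP:client_ip}        -> the extra leading IP address
    # %{COMBINEDAPACHELOG}   -> the standard Apache Combined portion
    # %{QS:content_type}     -> "image/png"
    # the two NUMBERs        -> the extra 200 and 4189 at the end
    # \[Response:...\]       -> the literal [Response:0] suffix
    match => {
      "message" => "%{IP:client_ip} %{COMBINEDAPACHELOG} %{QS:content_type} %{NUMBER:status2} %{NUMBER:bytes2:int} \[Response:%{NUMBER:response_flag}\]"
    }
  }
}
```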
