Grok does not work

Hello,
I have a log row like this (for example): 10.128.0.23 - - [16/May/2017:12:24:16 -0000] "GET /pp/api/personalprofile/update_status/na/Sweaty%20ASol HTTP/1.1" 101 361

and I use this grok pattern for parsing:

%{IP:clientip} \- \- \[%{HTTPDATE:timestamp}\] "%{WORD:action} /%{GREEDYDATA:message} %{WORD:protocol}/%{NUMBER:protocolNum}" %{NUMBER:status} %{NUMBER}

Whole filter:

filter {
  grok {
    match => {
      "message" => "%{IP:clientip} \- \- \[%{NOTSPACE:date} \+%{INT}\] \"%{WORD:action} /%{WORD}/%{WORD}/%{NOTSPACE:verb} %{WORD:protocol}/%{NUMBER:protocolNum}\" %{NUMBER:status} %{NUMBER}"
    }
    add_field => {
      "eventName" => "grok"
    }
  }
  geoip {
    source => "clientip"
  }
}

I checked this grok with https://grokdebug.herokuapp.com/ and it works fine.
However, I'm getting this error on Logstash 5.4:
[2017-05-16T17:11:29,774][ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Expected one of #, {, } at line 4, column 63 (byte 87) after filter {\ngrok{\nmatch=>{\n\"log\" => \"%{IP:clientip} \\- \\- \\[%{NOTSPACE:date} \\+%{INT}\\] \""}

Please advise - what am I doing wrong?
Thank you in advance.

Why not use the built-in COMMONAPACHELOG grok pattern?

filter {
  grok { match => { "message" => "%{COMMONAPACHELOG}" } }
  date { match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ] }
}

I put a simple configuration together and it works:

input { stdin {} }

filter {
  grok { match => { "message" => "%{COMMONAPACHELOG}" } }
  date { match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ] }
}

output { stdout { codec => rubydebug } }
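
A sketch of how a config like this can be run against stdin (the binary path and the test.conf file name are assumptions; adjust them to your install):

/usr/share/logstash/bin/logstash --path.config test.conf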

Here's what it looks like when I paste your sample from above:

[2017-05-16T11:36:23,922][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9601}
10.128.0.23 - - [16/May/2017:12:24:16 -0000] "GET /pp/api/personalprofile/update_status/na/Sweaty%20ASol HTTP/1.1" 101 361
{
        "request" => "/pp/api/personalprofile/update_status/na/Sweaty%20ASol",
           "auth" => "-",
          "ident" => "-",
           "verb" => "GET",
        "message" => "10.128.0.23 - - [16/May/2017:12:24:16 -0000] \"GET /pp/api/personalprofile/update_status/na/Sweaty%20ASol HTTP/1.1\" 101 361",
     "@timestamp" => 2017-05-16T12:24:16.000Z,
       "response" => "101",
          "bytes" => "361",
       "clientip" => "10.128.0.23",
       "@version" => "1",
           "host" => "localhost.local",
    "httpversion" => "1.1",
      "timestamp" => "16/May/2017:12:24:16 -0000"
}

As an aside, this is how that pattern is built:

COMMONAPACHELOG %{IPORHOST:clientip} %{HTTPDUSER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})" %{NUMBER:response} (?:%{NUMBER:bytes}|-)

You can see how each of these meta-patterns is built in the GitHub repository.

Hi,

@Igor_Gerasimow

did you try to use single quotes instead of double ones in the match expression? So, instead of the double-quoted pattern above, it should be:

"message" => '%{IP:clientip} - - [%{NOTSPACE:date} +%{INT}] "%{WORD:action} /%{WORD}/%{WORD}/%{NOTSPACE:verb} %{WORD:protocol}/%{NUMBER:protocolNum}" %{NUMBER:status} %{NUMBER}'

Well - it is an nginx web server log - will it work?
BTW - it is impossible to change the log format now.

nginx uses the same basic format.
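
If the log_format is nginx's default "combined" one, which appends the quoted referrer and user agent, the built-in COMBINEDAPACHELOG pattern should cover the whole line; a sketch, otherwise identical to the config above:

filter {
  # COMBINEDAPACHELOG = COMMONAPACHELOG followed by quoted referrer and agent fields
  grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }
  date { match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ] }
}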

Hi - no, this is not going to work, because my log can contain [16/May/2017:12:24:16 -0000]
and your example will drop it, because of
"message" => '%{IP:clientip} - - [%{NOTSPACE:date} +%{INT}] \"

Can you post the output of this command?

/usr/share/logstash/bin/logstash --path.config /etc/logstash/your_logstash_config.conf --config.test_and_exit

Change path to one that corresponds to your config file, of course

I did it.

/usr/share/logstash/bin/logstash --path.settings /etc/logstash/ --config.test_and_exit

Sending Logstash's logs to /var/log/logstash which is now configured via log4j2.properties

You didn't specify the exact config file; you have to do that.

See the results in the Logstash log file and post them.
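
For example, something along these lines (the conf.d path and file name are assumptions; point --path.config at wherever your pipeline config actually lives):

/usr/share/logstash/bin/logstash --path.settings /etc/logstash --path.config /etc/logstash/conf.d/your_logstash_config.conf --config.test_and_exit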

Hi - you're right :slight_smile:

10.128.0.17 - - [19/May/2017:12:29:12 +0000] "GET /public/a245afb093cb3064f1909c02782cbc63.jpg HTTP/1.1" 200 175003 "https://site.domain.st/profile/euw/BornToDieftw" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36" "-"
{
        "request" => "/public/a245afb093cb3064f1909c02782cbc63.jpg",
          "agent" => "\"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36\"",
           "auth" => "-",
          "ident" => "-",
           "verb" => "GET",
        "message" => "10.128.0.17 - - [19/May/2017:12:29:12 +0000] \"GET /public/a245afb093cb3064f1909c02782cbc63.jpg HTTP/1.1\" 200 175003 \"https://site.domain.st/profile/euw/BornToDieftw\" \"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.36\" \"-\"",
       "referrer" => "\"https://stats.mobalytics.gg/profile/euw/BornToDieftw\"",
     "@timestamp" => 2017-05-19T12:29:12.000Z,
       "response" => "200",
          "bytes" => "175003",
       "clientip" => "10.128.0.17",
       "@version" => "1",
           "host" => "elasticsearch-logs",
    "httpversion" => "1.1",
      "timestamp" => "19/May/2017:12:29:12 +0000"
}

but in Kibana I still don't see the parsing result :frowning:
