_grokparsefailure in Kibana although Grok tester returns "matched"

Hello

I have the following log file entry

2018-07-10 15:24:00.000 DEBUG 5445 --- [pool-2-thread-1] d.d.e.g.s.LoginRateLimitingService       : Exit: removeOutdatedLoginAttempts()

I wrote (or rather generated) the pattern below on the page http://grokconstructor.appspot.com/do/construction. The tool indicates that the pattern matches, but in Kibana I see the error _grokparsefailure.

The pattern:

%{EXIM_DATE}%{SPACE}%{LOGLEVEL}%{SPACE}%{NUMBER}%{SPACE}%{NOTSPACE}%{SPACE}%{JAVALOGMESSAGE}

What's wrong with my pattern?

Greetings

Please show

  • your configuration and
  • the raw event processed by Logstash (copy/paste the text from Kibana's JSON tab).

General advice for debugging grok expressions is to build them step by step, starting with the simplest possible expression (^%{EXIM_DATE} in this case). Continue adding more and more tokens until things break.
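Since grok patterns compile down to regular expressions, the step-by-step approach can be mimicked with plain regexes. A minimal sketch (the fragments below are rough stand-ins for the stock grok patterns, not their exact definitions):

```python
import re

# The sample log line from the original question.
line = ("2018-07-10 15:24:00.000 DEBUG 5445 --- [pool-2-thread-1] "
        "d.d.e.g.s.LoginRateLimitingService       : "
        "Exit: removeOutdatedLoginAttempts()")

# Build the expression incrementally: date, then time, then log level.
# The moment one step stops matching, the last token added is the culprit.
steps = [
    r"^\d{4}-\d{2}-\d{2}",                               # date only
    r"^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+",        # add the time
    r"^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+\s+\w+",  # add the log level
]

for pattern in steps:
    print(pattern, "->", "matched" if re.match(pattern, line) else "FAILED")
```

The same idea applies directly in Logstash: start the `match` string with just the timestamp token and append one token at a time.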

Hi Magnus,

I have now greatly simplified the inputs.conf and adjusted the grok pattern. Unfortunately, without success.
inputs.conf

input {
    beats {
        port => "5044"
        client_inactivity_timeout => 3000
    }
}
filter {
    if [fields][log_type] == "gatewaylog" {
        grok {
            match => { "message" => "%{DATESTAMP:timestamp}\s%{LOGLEVEL:LOGLEVEL}\s%{BASE10NUM:messagenumber}\s---\s%{GREEDYDATA:log_message}" }
        }
    }
}
output {
    if [fields][log_type] == "gatewaylog" {
        if [fields][environment] == "dev" {
            elasticsearch {
                hosts => "10.192.72.207"
                index => "gateway_dev"
                user => "elastic"
                password => "password"
            }
        }
    }
}

Kibana/JSON-Output:

{
  "_index": "gateway_dev",
  "_type": "log",
  "_id": "AWSS25XnCM1edv4moGDr",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2018-07-13T08:57:00.674Z",
    "offset": 3443002,
    "@version": "1",
    "beat": {
      "hostname": "Application--Server1",
      "name": "Application--Server1",
      "version": "5.6.3"
    },
    "input_type": "log",
    "host": "Application-Server1",
    "source": "/vol1/gateway-dev1_i01_4001/log/gateway-dev1_i01_4001.log",
    "message": "2018-07-13 10:57:00.000  INFO 5247 --- [pool-2-thread-1] d.d.e.g.s.LoginRateLimitingService       : remove outdated blacklist entries",
    "fields": {
      "environment": "dev",
      "log_type": "gatewaylog"
    },
    "type": "log",
    "tags": [
      "beats_input_codec_plain_applied",
      "_grokparsefailure"
    ]
  },
  "fields": {
    "@timestamp": [
      1531472220674
    ]
  },
  "sort": [
    1531472220674
  ]
}

Thank you for your effort

Greetings

In this example there are two spaces between the timestamp and the log level, but your grok expression only allows one.
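The difference is easy to verify with a plain regex, since `\s` matches exactly one whitespace character. A small sketch against a fragment of the message field:

```python
import re

msg_fragment = "2018-07-13 10:57:00.000  INFO"  # note: two spaces before INFO

# A single \s between the timestamp and the level fails on this line,
# while \s+ (or an explicit \s\s) succeeds.
one_space = r"^\S+ \S+\s(INFO)"
flexible  = r"^\S+ \S+\s+(INFO)"

print(re.match(one_space, msg_fragment))  # None: only one whitespace char allowed
print(re.match(flexible, msg_fragment))   # matches
```

Using `\s+` instead of a fixed number of `\s` tokens is the more robust choice here, since the level name is padded to a fixed width and the spacing varies between DEBUG and INFO lines.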

I've now changed the pattern as follows:

match => {"message" => "%{DATESTAMP:timestamp}\s\s%{LOGLEVEL:LOGLEVEL}\s%{BASE10NUM:messagenumber}\s---\s%{GREEDYDATA:log_message}"}

But the result is the same:

{
  "_index": "gateway_dev",
  "_type": "log",
  "_id": "AWSTjjM-CM1edv4moIdi",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2018-07-13T12:12:06.087Z",
    "offset": 4181258,
    "@version": "1",
    "input_type": "log",
    "beat": {
      "hostname": "application1-Server1",
      "name": "application1-Server1",
      "version": "5.6.3"
    },
    "host": "application1-Server1",
    "source": "/vol1/gateway-dev1_i01_4001/log/gateway-dev1_i01_4001.log",
    "message": "2018-07-13 14:12:00.001  INFO 5247 --- [pool-2-thread-1] d.d.e.g.s.LoginRateLimitingService       : remove outdated blacklist entries",
    "fields": {
      "environment": "dev",
      "log_type": "gatewaylog"
    },
    "type": "log",
    "tags": [
      "beats_input_codec_plain_applied",
      "_grokparsefailure"
    ]
  },
  "fields": {
    "@timestamp": [
      1531483926087
    ]
  },
  "sort": [
    1531483926087
  ]
}

:unamused:

Greetings

DATESTAMP doesn't match the timestamp you have. Use TIMESTAMP_ISO8601 instead.
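DATESTAMP is built on the DATE pattern, which only covers US (month/day/year) and EU (day.month.year) orderings, so a year-first ISO date never matches. A rough illustration with simplified stand-ins for the underlying regexes (not the exact stock pattern definitions):

```python
import re

timestamp = "2018-07-13 14:12:00.001"

# Simplified approximations of the grok building blocks:
# DATESTAMP is DATE + TIME, and DATE expects two-digit-first orderings
# such as MM/DD/YYYY or DD.MM.YYYY -- never the ISO year-first form.
datestamp_like = r"\d{2}[./-]\d{2}[./-]\d{4} \d{2}:\d{2}:\d{2}"
iso8601_like   = r"\d{4}-\d{2}-\d{2}[T ]\d{2}:\d{2}:\d{2}(?:\.\d+)?"

print(re.match(datestamp_like, timestamp))  # None
print(re.match(iso8601_like, timestamp))    # matches
```

With the stock patterns, a match line along the lines of `%{TIMESTAMP_ISO8601:timestamp}\s+%{LOGLEVEL:loglevel}\s%{BASE10NUM:messagenumber}\s---\s%{GREEDYDATA:log_message}` should therefore match the sample line.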

Had you followed my previous advice on debugging grok expressions ("... build them step by step, starting with the simplest possible expression ..."), this would have been easy to spot.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.