"_grokparsefailure" even though the grok pattern matches

Hi everyone,

I'm running into the following problem: I've tested my grok filter, including my custom patterns, in both the Grok Debugger and Grok Constructor, and it works fine there. Running Logstash with the -t option also reports no errors. But when I actually run Logstash, I still get a _grokparsefailure in my tags. So my question is: how can that be, and how do I start debugging (since I'm fairly sure my filters/patterns are OK)?

I'm using this logstash config

input {
  file {
    path => "/media/SAP/log/security_audit_log/*"
    type => "security_audit_log"
    start_position => "beginning"
    codec => plain {
      charset => "ISO-8859-1"
    }
  }
}

filter {
  if [type] == "security_audit_log" {
    mutate {
      gsub => [ "message", "  ", "" ]
    }
    grok {
      patterns_dir => "/media/ELK/logstash-5.5.1/config/paterns/*"
      match => { "message" => "\|%{SAPDATE:date}\|%{TIME:time}\|%{CLIENT:client}?\|%{USER:username}?%{SPACE}?\|%{TERMINAL:terminal}?%{SPACE}?\|%{TCODE:tcode}?%{SPACE}?\|%{PROGRAM:program}?%{SPACE}?\|%{AUDITCLASS:auditclass}%{SPACE}?\|%{SECURITYLEVEL:securitylevel}%{SPACE}?\|%{MESSAGETEXT:messagetext}%{SPACE}?\|" }
    }
    mutate {
      add_field => {
        "logtimestamp" => "%{date} %{time}"
      }
      remove_field => [ "date", "time" ]
    }
    date {
      match => [ "logtimestamp", "dd.MM.yyyy HH:mm:ss" ]
      timezone => "Europe/Berlin"
      locale => "en"
      target => "@timestamp"
    }
  }
}

output {
  if [type] == "security_audit_log" {
    elasticsearch {
      hosts => "localhost:9200"
      index => "%{type}-%{timestamp}"
    }
  }
  stdout {
    codec => "rubydebug"
  }
}

with these patterns

CLIENT \d{3}
TERMINAL [\w+.\-]+
TCODE [A-Z0-9_]+
PROGRAM [A-Z0-9_/]+
AUDITCLASS (\w+\s?\-?/?){1,3}
SECURITYLEVEL (\w+\s?){1,3}
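
Note that the grok expression above also references %{SAPDATE}, %{USER}, and %{MESSAGETEXT}, which aren't among the patterns shown. Hypothetical definitions consistent with the sample log lines might look like this (these are my guesses, not the poster's actual file):

```
# Hypothetical -- referenced by the grok expression but not shown in the post.
SAPDATE %{MONTHDAY}\.%{MONTHNUM}\.%{YEAR}
USER [\w.\-]+
MESSAGETEXT [^|]+
```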

And here you've got some log lines (anonymized)

|Date      |Time    |Cl.|User         |Terminal            |TCode          |Program                      |Auditclass         |Security Level  |AuditLog-Messagetext                                                            |
|20.07.2017|08:01:37|   |             |                    |               |                             |System-Ereignisse  |Hoch            |Applikationsserver gestartet                                                      |
|24.07.2017|11:17:05|000|SOMEUSER     |SOMETERMINAL        |SM19           |SAPMSM19                     |System-Ereignisse  |Hoch            |Audit Konfiguration geändert                                                      |
|24.07.2017|11:17:05|000|SOMEUSER     |SOMETERMINAL        |SM19           |SAPMSM19                     |System-Ereignisse  |Hoch            |Audit: Slot 1: Klasse 191, Gewicht 5, User * , Mandant 000,                       |
|24.07.2017|11:17:05|000|SOMEUSER     |SOMETERMINAL        |SM19           |SAPMSM19                     |System-Ereignisse  |Hoch            |Audit Konfiguration geändert                                                      |
|24.07.2017|11:17:05|000|SOMEUSER     |SOMETERMINAL        |SM19           |SAPMSM19                     |System-Ereignisse  |Hoch            |Audit: Slot 2 : inaktiv                                                           |
|24.07.2017|11:19:19|000|SOMEUSER     |                    |               |RSBTCRTE                     |Dialoganmeldung    |Mittel          |Login erfolgreich (Typ=B, Methode=A )                                             |
|24.07.2017|11:24:14|000|SOMEUSER     |                    |               |RSBTCRTE                     |Dialoganmeldung    |Mittel          |Login erfolgreich (Typ=B, Methode=A )                                             |

Is _grokparsefailure showing in all of the log messages or only specific ones?

Hi @CDR,

it's present in every single log message, without exception.

The question mark after %{SPACE} is what first sticks out to me. %{SPACE}? means that there will be zero or one match of SPACE. In your example logs it seems that there is more than a single space in those fields. Perhaps a better quantifier would be *, i.e. %{SPACE}*, which means "zero or more spaces until the |".

I applied your correction, but at the moment the behavior/_grokparsefailure hasn't changed. Unfortunately, I lost the connection to the machine running my ELK stack, so I'll have a look at the output file later.

Any particular reason you're using a grok filter instead of a csv filter?
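
For reference, a csv filter for this pipe-delimited format could look roughly like the sketch below. The column names are my guesses from the header line of the sample log, and it is untested against the real data:

```
filter {
  csv {
    separator => "|"
    # Column names guessed from the header line; the leading "empty"
    # column comes from each line starting with "|".
    columns => [ "empty", "date", "time", "client", "username", "terminal",
                 "tcode", "program", "auditclass", "securitylevel", "messagetext" ]
  }
  mutate {
    # Trim the padding spaces left over from the fixed-width columns.
    strip => [ "username", "terminal", "tcode", "program",
               "auditclass", "securitylevel", "messagetext" ]
  }
}
```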

To debug grok problems start simple, e.g. with ^\|%{SAPDATE:date}. Does that work? If yes, add the next token.

Hi @magnusbaeck,

thanks for your reply. Even starting with ^\|%{SAPDATE:date} gives me a _grokparsefailure. I also commented out the mutate and date filters. To be honest, I wasn't sure whether the csv filter would be applicable in my case, as I thought it would be better to match each field against a specific pattern instead of just splitting on columns.

Best regards,


This works for me:

$ cat test.config 
input { stdin { } }
output { stdout { codec => rubydebug } }
filter {
  grok {
    match => ["message", "^\|%{MONTHDAY}.%{MONTHNUM}.%{YEAR}"]
  }
}
$ echo '|20.07.2017' | /opt/logstash/bin/logstash -f test.config
Settings: Default pipeline workers: 8
Pipeline main started
{
       "message" => "|20.07.2017",
      "@version" => "1",
    "@timestamp" => "2017-08-14T06:36:13.724Z",
          "host" => "lnxolofon"
}
Pipeline main has been shutdown
stopping pipeline {:id=>"main"}

I think I've figured out the root cause of this problem: when I comment out my patterns_dir line, everything works fine, with no unexpected _grokparsefailure.

Some background information: at first my patterns weren't being loaded at all from the path given in patterns_dir. After looking at the output with the --debug option, it seemed Logstash was looking for patterns in /media/ELK/logstash-5.5.1/patterns, while my custom path wasn't listed anywhere. So I moved my patterns to the path Logstash expects. (And yes, the pattern files are identical.)
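
For anyone hitting the same symptom: one possible explanation (an assumption on my part, not confirmed in this thread) is that patterns_dir expects one or more directories, with file selection handled separately by patterns_files_glob, so a value ending in /* may not resolve to any pattern files. A minimal sketch pointing at the directory itself:

```
grok {
  # Point patterns_dir at the directory, not a glob inside it
  # (file matching within the directory defaults to "*").
  patterns_dir => ["/media/ELK/logstash-5.5.1/config/paterns"]
  match => { "message" => "^\|%{SAPDATE:date}" }
}
```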

Thank you all for your support!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.