Logstash grok pattern not working even though testers say it should

I've been building some grok patterns to parse the /var/log/secure log file, and everything has been working fine. I create the grok patterns at http://grokconstructor.appspot.com/ and then test them at http://grokdebug.herokuapp.com/, and both sites show the patterns matching perfectly. I'm using Logstash 2.1.1, Elasticsearch 2.1.1, and Kibana 4.3.1, all running on CentOS 7.1 with OpenJDK 1.8.0.65-2.b17.

I have taken those patterns and implemented them in a filter on my Logstash servers. Most of the patterns match fine, but SECURENETREG and SECURENETBADGE are not matching for some reason. logstash --configtest shows no problems and Logstash is otherwise running normally, but when I look in Kibana at entries that should match those patterns, none of the parsing has been applied.

Here is my patterns file, /etc/logstash/patterns.d/secure-log.grok:

SECURETIMESTAMP %{MONTH}%{SPACE}%{MONTHDAY} %{TIME}
SECUREPROG %{PROG:program}(?:\[%{POSINT:pid}\])?
SECUREHOST %{IPORHOST:host}
SECUREBASE %{SECURETIMESTAMP:secure_timestamp}%{SPACE}%{SECUREHOST}%{SPACE}%{SECUREPROG}:
SECURESU %{SECUREBASE} (runuser: |)%{PROG:pam_program}(?:\[%{POSINT:pid}\])? session %{WORD} for user %{USER:user}( by \(uid=%{NUMBER:su_caller_uid}\)|)
SECURESUDORUN %{SECUREBASE}%{SPACE}(%{USER:user} : TTY=%{DATA} ; PWD=%{DATA} ; USER=%{USER:sudo_runas_user} ; COMMAND=%{GREEDYDATA:sudo_command}|\S+:%{SPACE}(TGT verified|error reading keytab %{GREEDYDATA}|authentication %{WORD} for '%{USER:user}'%{GREEDYDATA}|%{GREEDYDATA}user=%{USER:user}))
SECURESSHDPUBKEY %{SECUREBASE} (Found matching RSA key: %{GREEDYDATA:rsa_key}|%{WORD} publickey for %{USER:user} from %{IPORHOST:src_ip}( port %{NUMBER:port} %{WORD}( \[preauth\]|: RSA %{GREEDYDATA:rsa_key}|)|)|)
SECURESSHDREST %{SECUREBASE} (Did not receive identification string from %{IPORHOST:src_ip}|pam_unix\(sshd:session\): session %{WORD} for user %{USER:user}|Starting session: command for %{USER:user} from %{IPORHOST:src_ip}|Connection from %{IPORHOST:src_ip}|Accepted (password|publickey) for %{USER:user} from %{IPORHOST:src_ip}|Received disconnect from %{IPORHOST:src_ip}|Connection closed by %{IPORHOST:src_ip}|User child is on pid %{NUMBER}|Set %{UNIXPATH} to %{NUMBER})
SECURENETREG %{SECUREBASE} connect from %{IPORHOST:src_ip}%{GREEDYDATA}
SECURENETBADGE %{SECUREBASE} (%{WORD:whois_action}|Authentication %{WORD:auth_result}): (reply from %{URI:whois_uri}: Result: %{GREEDYDATA:whois_result}|User: %{USER:user}, %{GREEDYDATA:auth_result_detail}, From: %{IPORHOST:src_ip}, %{GREEDYDATA}, URL: %{URI:netbadge_source_uri})

Here is my config file that applies the filter, /etc/logstash/conf.d/46-filter-secure-log.conf:

filter {
  if [type] == "secure" {
    grok {
      patterns_dir => ["/etc/logstash/patterns.d/"]
      match        => { "message" => [
        "%{SECURESU}",
        "%{SECURESUDORUN}",
        "%{SECURESSHDPUBKEY}",
        "%{SECURESSHDREST}",
        "%{SECURENETREG}",
        "%{SECURENETBADGE}"
        ]
      }
      add_field    => {
        "received_at"     => "%{@timestamp}"
        "received_from"   => "%{host}"
      }
    }
    date {
      match => [ "secure_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

And finally, here are some log messages (scrubbed of identifying info) that match perfectly in the pattern testers but do not get parsed in Logstash. I know the filter is hitting them, because I temporarily added a new field to confirm these messages were being run through it:

Jan  8 09:22:22 netbadge-serv netbadge[3534]: verify_whois: reply from https://whois.domain.edu/whois: Result: 0:-1000:0:0:Error with submitted data Illegal characters in data stream#012
Jan  8 09:22:19 netreg-serv autoreg.pl[13867]: connect from 10.250.100.22 (10.250.100.22)
Jan  8 09:22:19 netbadge-serv netbadge[3522]: Authentication success: User: mst3k, Password: Test Test, From: 10.250.28.30, Appid: webmail_login, URL: https://www.mail.domain.edu/switchboard/
Jan  8 09:39:51 netbadge-serv netbadge[11358]: Authentication failure: User: mst3k, Invalid User/Password, From: 10.250.28.31, Appid: Shibboleth Identity Provider, URL: https://shib.domain.edu/idp/Authn/RemoteUser

I'm sure it's something simple I'm missing after staring at all this for so long, so I hope someone can tell me what's going on here.

Thanks,

Bob

Be systematic and start with a minimal grok expression like

^%{SECURETIMESTAMP}

and make sure that works. Then append the next token, e.g.

^%{SECURETIMESTAMP:secure_timestamp}%{SPACE}%{SECUREHOST}

and continue until things break. Doing so should narrow the problem down.
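
For instance, once the timestamp and host match, the next increment would be the full SECUREBASE expansion:

^%{SECURETIMESTAMP:secure_timestamp}%{SPACE}%{SECUREHOST}%{SPACE}%{SECUREPROG}:

and after that, the literal text that follows SECUREBASE in whichever failing pattern you're chasing (e.g. " connect from" for SECURENETREG).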

Well, those pieces all work. It can be seen in the kibana output that the log message gets through the SECUREBASE filter part, but after that is when things don't seem to work in logstash. Also, there is no _grokparsefailure getting applied so it seems that something is working on the message.

Well, those pieces all work. It can be seen in the kibana output that the log message gets through the SECUREBASE filter part, but after that is when things don't seem to work in logstash.

Right, but what if you continue to build your expression gradually, converging towards SECURENETREG or SECURENETBADGE that don't work? At some point I'd imagine that things stop working.

Also, there is no _grokparsefailure getting applied so it seems that something is working on the message.

Yeah, that's surprising. So what do you get for a SECURENETREG message? Use a stdout { codec => rubydebug } output to inspect the event. To shorten the feedback loop, leave Elasticsearch and Kibana out of it for now.
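
A minimal throwaway pipeline for this could look something like the following (a sketch; the pattern subset is just for illustration):

input {
  stdin { type => "secure" }
}
filter {
  grok {
    patterns_dir => ["/etc/logstash/patterns.d/"]
    match        => { "message" => [ "%{SECURENETREG}", "%{SECURENETBADGE}" ] }
  }
}
output {
  stdout { codec => rubydebug }
}

Run it with bin/logstash -f test.conf, paste one of your sample lines on stdin, and rubydebug will print the full event so you can see exactly which fields (if any) were extracted, and whether _grokparsefailure was tagged.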

Figured it out. At the end of SECURESSHDPUBKEY there was a | just before the closing ), which made the final alternative match the empty string. That pattern therefore matched everything, so the patterns further down the list were never tried. I knew it was simple; I had just been staring at that grok for too long.
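
For anyone who hits the same symptom: a trailing empty alternative like (a|b|) matches the empty string, and grok stops at the first pattern in the match list that succeeds (break_on_match defaults to true). Trimmed down to its shape, the bug looked like this:

SECURESSHDPUBKEY %{SECUREBASE} (Found matching RSA key: %{GREEDYDATA:rsa_key}|%{WORD} publickey for %{USER:user} ...|)

Dropping that final | so the group ends with a real alternative means SECURESSHDPUBKEY can only succeed when one of its alternatives actually matches, letting SECURENETREG and SECURENETBADGE get their turn. It also explains why no _grokparsefailure was ever tagged: a pattern did match every line, just not the intended one.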

Thanks!