Help with Logstash Grok

I'm sure it's something I'm missing, but I can't seem to figure this out. I'm trying to build a grok filter for some syslog input from a system. The Grok Debugger validates it, as do the other three online grok validators I tried, but when I put the filter in place it throws a _grokparsefailure.

My original grok filter is quite a bit longer than what's shown here; I simplified it to isolate the issue. The current problem seems to be that it will not let me escape the slash no matter what I do. If I go into the filter and delete the two slashes just before the %{GREEDYDATA:WHAT}, it works fine; the moment I try to escape that backslash, it fails. Everything I've read says that should work. What am I missing? Thanks in advance!
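One way to reason about this kind of escaping problem: each layer of quoting can consume one level of backslashes before the regex engine ever sees the pattern. A small Python analogy (an assumption for illustration only — Logstash's config parser has its own quoting rules, so this shows the general effect, not Logstash's exact behavior):

```python
import re

line = 'event="hello"'

# Raw string: backslashes reach the regex engine untouched.
print(re.search(r'event=\"(\w+)\"', line).group(1))  # hello

# Plain string: \\\" in the source collapses to \" in the actual string,
# so the regex engine receives exactly the same pattern as above.
print(re.search('event=\\\"(\\w+)\\\"', line).group(1))  # hello
```

Both calls match the same text; the difference is only how many backslashes had to appear in the source to survive the string layer.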

Using Logstash 6.4 on Ubuntu 16.04.

<118>Oct 5 09:25:44 SYSTEM THING: [ZC@0 event="ZC-Object accessed (change)" event_type="C-Change object" sev="2" actual_type="ZC-C" object_name="OBJECT" object_library="LIBRARY" object_type="*FILE" access_type="Open" specific_data="" jrn_seq="14245869" timestamp="20181005092544262000" job_name="THISJOB" user_name="USERACCOUNT" job_number="428949" eff_user="USERACCOUNT" logical_partition="004" ip_addr="x.x.x.x" port="55515"]",
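For reference, here is a rough sketch in Python of the plain regex that a grok pattern matching this sample would need to compile to (assumptions for illustration: grok's %{DATA} behaves like .*? and %{GREEDYDATA} like .*, and the sample line below is a hypothetically shortened version of the message above):

```python
import re

# The literal '[' before the ZC@0 block and the '"' after event= are the
# characters that need escaping at the regex level.
pattern = r'\[(?P<Event_Code>.*?) event="(?P<WHAT>.*)\]'

# Shortened version of the sample syslog message (hypothetical trimming).
line = ('<118>Oct 5 09:25:44 SYSTEM THING: [ZC@0 '
        'event="ZC-Object accessed (change)" sev="2" port="55515"]')

m = re.search(pattern, line)
print(m.group('Event_Code'))  # ZC@0
print(m.group('WHAT'))
```

Whatever the filter ends up looking like in the config file, the regex engine ultimately has to receive those escaped literals intact.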

My filter looks like this:
input {
  tcp {
    host => ["x.x.x.x"]
    port => 5000
    type => syslog
  }
  udp {
    host => ["x.x.x.x"]
    port => 5000
    type => syslog
  }
}
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:Host} %{WORD:Journal}: [%{DATA:Event_Code} event=\%{GREEDYDATA:WHAT}]"}
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch { hosts => ["x.x.x.x:9200"] }
  stdout { codec => rubydebug }
}
