Grok filter not filtering data

I have Elasticsearch, Logstash and Kibana, all version 5.6.2, configured and running on Windows 2012 R2. Here is some sample data I am trying to parse (filtered for sensitive info):

2017-09-18 00:00:01 Local4.Debug 00.00.00.0 Sep 18 2017 00:00:01: %ASA-0-000000: UDP request discarded from 00.00.00.00/00000 to COVERT:000.000.000.000/0000

The following is my grok pattern in my Logstash config file, which reads from a .txt file:

    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}%{SPACE}Local4.%{LOGLEVEL}%{SPACE}%{IP}%{SPACE}%{CISCOTIMESTAMP}: %%{CISCOTAG}%{GREEDYDATA}" }
    }

It parses in the grok debugger, but in Kibana the whole entry shows up as message rather than as separate fields. The timestamp also gets set to the current time, even though I'm reading logs that are 24 hours old. I'm not sure what's going on.

If I test it on http://grokdebug.herokuapp.com/ it matches, but when I enable 'Named Captures Only' I only see the timestamp. I'm not sure whether grok defaults to 'Named Captures Only'; you might want to investigate that.

Good luck.
Paul.
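
For what it's worth, the other way around this is to give each pattern a field name, so grok keeps those captures regardless of that setting. A rough sketch against your sample line (the extra field names here are just illustrative):

    filter {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}%{SPACE}Local4.%{LOGLEVEL:loglevel}%{SPACE}%{IP:source_ip}%{SPACE}%{CISCOTIMESTAMP:cisco_timestamp}: %%{CISCOTAG:cisco_tag}%{GREEDYDATA:cisco_message}" }
      }
    }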

It looks like named_captures_only defaults to true, so how would I go about setting it to false?

I would guess something like this.

    filter {
      grok {
        named_captures_only => false
        match => { "message" => "stuff" }
      }
    }
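
On the @timestamp part of the question: Logstash stamps events with the time they were read unless you map the parsed time onto @timestamp with a date filter. A minimal sketch, assuming the captured field is called timestamp as in your pattern:

    filter {
      date {
        # "2017-09-18 00:00:01" from the grok capture
        match => [ "timestamp", "yyyy-MM-dd HH:mm:ss" ]
        target => "@timestamp"
      }
    }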

Worked, thank you!
