Grok: match and split a string into multiple fields

Hi

I want to match a syslog string and split it into corresponding fields. Here are the messages in question:

Jul 06 2018 08:04:18: %ASA-6-305011: Built dynamic UDP translation from any:123.123.123.123/35439 to OUTSIDE-VRF180:123.123.123.123/35439
Jul 6 00:00:02 172.28.51.197 %ASA-6-302016: Teardown UDP connection 22180105 for INSIDE-VRF4100:123.123.123.123/21588 to SDN-VRF110:123.123.123.123/902 duration 0:02:01 bytes 66

The grok filter I'm running against these is the following:

grok {
        match => ["message", "%{SYSLOGHOST:ciscotag}%{GREEDYDATA:cisco_message}"]  
      }

However, the problem is that not all messages have a timestamp, so I cannot match on the timestamp first. What I would like is a filter that parses each syslog message correctly whether or not it contains a timestamp. Sort of like an "or" statement, I think?
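I'm imagining something like the following, since grok can take an array of patterns and tries them in order until one matches (completely untested, and the extra field names like "log_timestamp" and "sysloghost" are just placeholders I made up):

grok {
  # Patterns are tried in order; the first one that matches wins
  # (break_on_match defaults to true), which gives the "or" behaviour.
  match => {
    "message" => [
      "^%{CISCOTIMESTAMP:log_timestamp}( %{SYSLOGHOST:sysloghost})?:? %%{CISCOTAG:ciscotag}: %{GREEDYDATA:cisco_message}",
      "^%%{CISCOTAG:ciscotag}: %{GREEDYDATA:cisco_message}"
    ]
  }
}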

Thanks!

EDIT:

So I used the "Incremental Construction of Grok Patterns (in progress)" tool on the grokconstructor website to try to build a grok filter that also matches the timestamp, and this is what I came up with:

^%{CISCOTIMESTAMP}%{CRON_ACTION}%{SYSLOGHOST}%{CRON_ACTION}%{SYSLOGPROG:ciscotag}%{SPACE}%{GREEDYDATA:cisco_message}$

On the grokconstructor website it matches successfully, but when I try it in Logstash I get part of the timestamp in "ciscotag", for example. So the match fails and the event gets tagged with "_grokparsefailure".
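For reference, in the Logstash config that pattern sits in a grok filter roughly like this (the rest of the pipeline omitted):

grok {
  match => ["message", "^%{CISCOTIMESTAMP}%{CRON_ACTION}%{SYSLOGHOST}%{CRON_ACTION}%{SYSLOGPROG:ciscotag}%{SPACE}%{GREEDYDATA:cisco_message}$"]
}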

> On the grokconstructor website it matches successfully, but when I try it in Logstash I get part of the timestamp in "ciscotag", for example. So the match fails and the event gets tagged with "_grokparsefailure".

What does that event look like? Use a stdout { codec => rubydebug } output to dump the raw event.
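That is, something like this added to the output section while debugging:

output {
  # Prints every event to stdout in a readable Ruby-hash-like format.
  stdout { codec => rubydebug }
}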

{
               "@timestamp" => 2018-07-06T11:58:50.367Z,
                "cendotSID" => "S136156",
            "src_interface" => "outside",
               "cendotFQDN" => "FQDNserver.hello",
                "direction" => "outbound",
            "dst_mapped_ip" => "123.123.123.123",
                 "src_port" => "80",
            "src_mapped_ip" => "123.123.123.123",
                 "protocol" => "TCP",
                     "host" => "123.123.123.123",
                  "message" => "06 2018 13:58:49: %ASA-6-302013: Built outbound TCP connection 2894463360 for outside:123.123.123.123/80 (123.123.123.123/80) to inside:123.123.123.123/50274 (123.123.123.123/50274)\n",
            "connection_id" => "2894463360",
                     "tags" => [
        [0] "syslog",
        [1] "pre-processed",
        [2] "Firewall",
        [3] "ASA",
        [4] "log01"
    ],
          "dst_mapped_port" => "50274",
                 "@version" => "1",
        "cendotServiceName" => "firewall",
    "syslog_severity_code" => 5,
    "syslog_facility_code" => 1,
          "src_mapped_port" => "80",
                 "ciscotag" => "Jul",
          "syslog_severity" => "notice",
                   "action" => "Built",
                   "src_ip" => "123.123.123.123",
            "dst_interface" => "inside",
                   "dst_ip" => "123.123.123.123",
          "syslog_facility" => "user-level",
                 "dst_port" => "50274"
}

You can see that "ciscotag" ended up as "Jul" (for July), and that the "message" field still contains part of the timestamp as well as the actual ciscotag.

Where's the _grokparsefailure tag? What does the configuration look like?

I've figured out the root cause of the error. The firewall generating the events that cause a _grokparsefailure is an ASA FirePOWER with the SFR module. The logs from it look either like this:

SFR requested to drop TCP packet from OUTSIDE-VRF180:123.123.123.123/80 to INSIDE-VRF4100:123.123.123.123/6532

or like this:

SFR requested ASA to bypass further packet redirection and process TCP flow from INSIDE-VRF4100:123.123.123.123/57301 to OUTSIDE-VRF180:123.123.123.123/443 locally

And if I'm not mistaken, Logstash has no built-in support for this? So I would have to write my own grok filter?

I want the same kind of functionality that the supported Cisco patterns provide, with fields like "action", "src_ip", "src_port", etc.

> And if I'm not mistaken, Logstash has no built-in support for this? So I would have to write my own grok filter?

I have no idea which Cisco-related grok patterns Logstash supports out of the box, but if it doesn't cover these messages you'll have to roll your own.
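If you do end up writing your own, a starting point for those two SFR messages might be something like this (an untested sketch; the field names simply mirror what the built-in ASA patterns produce, and you may need to match against a different field or add anchoring depending on what precedes the "SFR" text):

grok {
  # Two patterns, one per SFR message variant; the first that matches wins.
  match => {
    "message" => [
      "SFR requested to %{WORD:action} %{WORD:protocol} packet from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}",
      "SFR requested ASA to bypass further packet redirection and process %{WORD:protocol} flow from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port} locally"
    ]
  }
}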
