ASA Single Grok Filter Not Working

Hi, so I'm using several grok filters to parse out different messages. My config is a bit messy now because I feel like I've tried everything. Logs are being properly parsed through pre-made filters and one custom one specified later on. I am trying to specify another custom filter to parse the following message:

<140>Sep 30 2021 14:59:53: %ASA-4-113019: Group = group_policy, Username = username, IP = x.x.x.x, Session disconnected. Session Type: SSL, Duration: 0h:00m:40s, Bytes xmt: 7508, Bytes rcv: 786, Reason: User Requested
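(For isolating a single pattern, I found it helps to run it in a throwaway pipeline and paste the sample line into stdin — this is just a sketch using the same pattern as my custom filter below, run with something like `bin/logstash -f test.conf`:)

```
# Minimal test pipeline: reads lines from stdin, applies only the
# ASA-4-113019 pattern, and prints the parsed event to stdout.
input { stdin { } }

filter {
  grok {
    match => {
      "message" => "%%{PROG:device}-%{INT:severity}-%{POSINT:event_id}: Group = %{GREEDYDATA:vpn_group_policy}, Username = %{WORD:username}, IP = %{IP:vpn_client_public_ip}, Session %{WORD:vpn_action}. Session Type: %{GREEDYDATA:vpn_sessiontype}, Duration: %{GREEDYDATA:vpn_duration}, Bytes xmt: %{INT:vpn_session_bytes_xmt}, Bytes rcv: %{INT:vpn_session_bytes_rcv}, Reason: %{GREEDYDATA:vpn_reason}"
    }
  }
}

output { stdout { codec => rubydebug } }
```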

Here's the entire conf file, with the specific issue portion marked with ** around it.

###############

input {
        udp {
                port => 10514
                type => "cisco-fw"
        }
}

###############

filter {

if "ASA-6-434004" in [message] { drop{ } }
if "ASA-6-305012" in [message] { drop{ } }


        # Extract fields from each of the detailed message types.
        # The patterns below are included in the core of Logstash 1.4.2.
        grok {
                match => [
                        "message", "%{CISCOFW106001}",
                        "message", "%{CISCOFW106006_106007_106010}",
                        "message", "%{CISCOFW106014}",
                        "message", "%{CISCOFW106015}",
                        "message", "%{CISCOFW106021}",
                        "message", "%{CISCOFW106023}",
                        "message", "%{CISCOFW106100}",
                        "message", "%{CISCOFW110002}",
                        "message", "%{CISCOFW302010}",
                        "message", "%{CISCOFW302013_302014_302015_302016}",
                        "message", "%{CISCOFW302020_302021}",
                        "message", "%{CISCOFW305011}",
                        "message", "%{CISCOFW313001_313004_313008}",
                        "message", "%{CISCOFW313005}",
                        "message", "%{CISCOFW402117}",
                        "message", "%{CISCOFW402119}",
                        "message", "%{CISCOFW419001}",
                        "message", "%{CISCOFW419002}",
                        "message", "%{CISCOFW500004}",
                        "message", "%{CISCOFW602303_602304}",
                        "message", "%{CISCOFW710001_710002_710003_710005_710006}",
                        "message", "%{CISCOFW713172}",
                        "message", "%{CISCOFW733100}"
                ]
        }

        **grok {**
        **        match => {**
        **                "message" => [ "%%{PROG:device}-%{INT:severity}-%{POSINT:event_id}: Group = %{GREEDYDATA:vpn_group_policy}, Username = %{WORD:username}, IP = %{IP:vpn_client_public_ip}, Session %{WORD:vpn_action}. Session Type: %{GREEDYDATA:vpn_sessiontype}, Duration: %{GREEDYDATA:vpn_duration}, Bytes xmt: %{INT:vpn_session_bytes_xmt}, Bytes rcv: %{INT:vpn_session_bytes_rcv}, Reason: %{GREEDYDATA:vpn_reason}" ]**
        **        }**
        **}**
        
## Call "patterns" file in /opt/logstash/ for the following messages
## CISCOFWSFR434002_SFR - Parses SFR Requested to Drop messages that were not being parsed from messages above.
## CISCOLOGDETAILS - Parses the severity and the event id from the beginning of each line
        grok {
                patterns_dir => ["/opt/logstash/patterns"]
                match => [
                        "message", "%{CISCOFWSFR434002_SFR}",
                        "message", "%{CISCOLOGDETAILS}"
                        ]
        }

## Lookup Geoip city/country/long&lat for the source address

geoip {
      source => "src_ip"
      target => "src_geoip"
      database => "/home/xxxxx/GeoLite2-City.mmdb"
      add_field => [ "[src_geoip][coordinates]", "%{[src_geoip][longitude]}" ]
      add_field => [ "[src_geoip][coordinates]", "%{[src_geoip][latitude]}"  ]
    }
    mutate {
      convert => [ "[src_geoip][coordinates]", "float"]
    }
    # do GeoIP lookup for the ASN/ISP information.
    geoip {
      database => "/home/xxxxx/GeoLite2-ASN.mmdb"
      source => "src_ip"
      target => "src_geoip_asn"
    }

## Lookup Geoip city/country/long&lat for the destination address

geoip {
      source => "dst_ip"
      target => "dst_geoip"
      database => "/home/xxxxx/GeoLite2-City.mmdb"
      add_field => [ "[dst_geoip][coordinates]", "%{[dst_geoip][longitude]}" ]
      add_field => [ "[dst_geoip][coordinates]", "%{[dst_geoip][latitude]}"  ]
    }
    mutate {
      convert => [ "[dst_geoip][coordinates]", "float"]
    }

    ## do GeoIP lookup for the ASN/ISP information.

    geoip {
      database => "/home/xxxxx/GeoLite2-ASN.mmdb"
      source => "dst_ip"
      target => "dst_geoip_asn"
    }
}

######################

## Where we send the filtered data above to.

output {
   elasticsearch { 
     hosts => ["localhost:9200"]
     index => "network-cisco-fw"
     user => "elastic"
     password => "xxxxx"
 }
}


######################

Based on the output I'm seeing in Kibana, it's using the "message", "%{CISCOLOGDETAILS}" portion of my config (which I've designed as a catch-all).

I am already parsing out that SFR message as defined here:
"message", "%{CISCOFWSFR434002_SFR}". I did at one time have my bolded grok filter in that /patterns file to tidy everything up, but when it wasn't working I moved it into the config for easier troubleshooting. I've verified with a grok debugger that multiple messages are being properly processed, so I believe my grok pattern is correct, but I'm not sure why it's choosing the CISCOLOGDETAILS pattern instead.
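One thing I learned while poking at this: separate grok blocks in the same filter section all run on every event, and any block that fails to match tags the event with _grokparsefailure. So a later catch-all can still fire alongside (or instead of, field-wise) the filter you expected. A sketch of guarding the catch-all on the failure tag — untested against this exact config, so treat it as an idea rather than a drop-in fix:

```
# Only fall back to the catch-all when nothing above matched.
if "_grokparsefailure" in [tags] {
        grok {
                patterns_dir => ["/opt/logstash/patterns"]
                match => [ "message", "%{CISCOLOGDETAILS}" ]
                # add/remove actions only apply on a successful match,
                # so this clears the tag when the catch-all parses the line
                remove_tag => [ "_grokparsefailure" ]
        }
}
```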

Please help!!

The grok works just fine for me:

           "vpn_reason" => "User Requested",
             "severity" => "4",
     "vpn_group_policy" => "group_policy",
           "vpn_action" => "disconnected",
             "username" => "username",

etc.

I know the grok works, but it's not showing up correctly in Kibana.

Oh boy... This was a terrible case of user error. I was calling the wrong file in my pipeline and modifying a similar one I had copied earlier.

It's working beautifully now that I'm calling the right file.
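For anyone who hits the same thing: it's worth confirming exactly which file (or glob) your pipeline actually loads, e.g. in config/pipelines.yml (the id and path here are illustrative, not my actual setup):

```
# config/pipelines.yml -- make sure path.config points at the
# file you are actually editing
- pipeline.id: cisco-fw
  path.config: "/etc/logstash/conf.d/cisco-fw.conf"
```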

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.