Help with GROK pattern - Cisco ASA CX

Hi All
I have a new ELK installation and all is going well; I have a couple of Windows servers and a firewall sending logs. Once I had told the installation to use more RAM, everything has been running really well.

I'm trying to send logs from our Cisco CX web filtering, but I believe I need to create a new grok pattern, as I'm currently getting _grokparsefailure.

The log line that is being produced is below:

<142>1 2016-05-16T13:54:30.578Z asacx-2 CiscoNGFW 20135 6 [ngfwEvent@9 Flow_Dst_Service="tcp/80" Flow_Bytes_Sent="196" Event_Type="0" Flow_DstIp="92.123.140.146" Flow_SrcIp="192.168.120.40" Count="1" Url_Category_Name="Software Updates" Flow_Bytes="196" Web_Reputation_Threat_Type="" Avc_Tag_Name="" Ev_SrcLabel="ASA CX" Event_Type_Name="HTTP Deny" Auth_Realm_Name="Adrian Flux Realm" User_Realm="Adrian Flux Realm\Eilidh Alexander" Policy_Name="Deny Internet Access" Flow_Transaction_Id="0" Url="http://armdl.adobe.com/pub/adobe/reader/win/11.x/11.0.16/misc/AdbeRdrUpd11016.msp" Identity_Source_Name="AD Agent" Auth_Policy_Name="Default" Flow_SrcIfc="inside" Flow_ConnId="245191611" Identity_Type_Name="Passive" Flow_DstHostName="armdl.adobe.com" Flow_Transaction_Count="1" Ev_Id="80756764" AAA_User="Eilidh Alexander" Web_Reputation_Score="4.1" Event_Type_Action="Deny" Ev_GenTime="1463406782823" Flow_DstPort="80" Flow_DstIfc="outside" Ev_SrcId="24" Avc_App_Name="HyperText Transfer Protocol" Ev_SrcHwType="ASA-CX" Flow_SrcPort="51463" Smx_Config_Version="544" Flow_Requests_Denied="1" Avc_App_Type="Infrastructure" Connection_Dst_Service="" Flow_Protocol="tcp" Ev_Producer_Name="HTTP Inspector"]

I would like to grok this so we can have all the headings as fields in Kibana, e.g. User_Realm, Flow_DstPort.

I believe I need to do something with the grok filter:

grok {
  match => ["message", ..................]
}

Thanks in advance

Mark

I found https://grokdebug.herokuapp.com/ and its Test grok patterns tool very useful when writing my grok filters.

There is a lot of info there, so I'd start with match => ["message" , "%{GREEDYDATA:message}"] and then try to filter out the useful bits.

I've made a start here as an example of how I begin constructing these:

<142>1 %{TIMESTAMP_ISO8601:time}Z%{GREEDYDATA:data} \[ngfwEvent@%{GREEDYDATA:more_stuff_to_filter}

(Note the \[ — a bare [ is special in regex, so it has to be escaped.)


Thanks for this; sorry, but I'm a complete newbie to this.
In my config file for the logs, would I need to do match => ["message", "<142>1 %{TIMESTAMP_ISO8601:time}Z%{GREEDYDATA:data} \[ngfwEvent@%{GREEDYDATA:more_stuff_to_filter}"]?

I understand I need to split up the log line; it's just the how that I'm stuck on.

Thanks

You would want to use grokdebug to get the fields pulled out the way you want, and then it's exactly as you think:

match => [ "message", "%{blah}%{blah} ]

I haven't messed with logs from the ASA CX models, but my experience with the ASA has been that the messages are not completely uniform (as with a lot of systems), so you will likely need to have multiple grok patterns tried. They are attempted from the top down until one matches successfully, so you want to put your most specific ones at the top and have a catchall at the bottom that should match any message the more specific ones miss. You can change the grok config in Logstash (break_on_match) so that it doesn't stop after a successful match, but that sounds like a horrible idea.

So you want...

match => [
   "message", "%{most} %{specific} %{and/or} %{most} %{common}",
   "message", "%{less}",
   "message", "%{catch} %{all}"
]
overwrite => [ "message" ]

Roughly, a catchall may be something like

%{SYSLOG5424PRI}%{INT:whatever_this_is} %{TIMESTAMP_ISO8601:time} %{GREEDYDATA:message}
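
Putting those together, a minimal sketch of the whole filter block might look like this (the first pattern and its field names, device and cxmessage, are just illustrative, not anything official):

filter {
  grok {
    # Patterns are tried top-down; grok stops at the first successful
    # match because break_on_match defaults to true.
    match => [
      "message", "%{SYSLOG5424PRI}%{INT:logtype} %{TIMESTAMP_ISO8601:time} %{HOSTNAME:device} CiscoNGFW %{GREEDYDATA:cxmessage}",
      "message", "%{SYSLOG5424PRI}%{INT:whatever_this_is} %{TIMESTAMP_ISO8601:time} %{GREEDYDATA:message}"
    ]
    overwrite => [ "message" ]
  }
}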

EDIT:
I forgot to mention: keep looking for the tag value _grokparsefailure to spot any messages that aren't being matched.

Another tip for your example: create a custom pattern to match all those blah="blah" fields.

something like

asaXfield %{DATA:field}="%{DATA:value}"

Then you can use it inside other patterns, like

%{SYSLOG5424PRI}%{INT:something} %{TIMESTAMP_ISO8601:time} %{DATA:something1} %{DATA:something2} %{INT:something3} %{INT:something4} \[%{DATA:something5} %{asaXfield}
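
In case it's useful: custom definitions like asaXfield live one per line in a file inside a directory you point grok's patterns_dir option at. A rough sketch, assuming a hypothetical file /etc/logstash/patterns/asacx containing:

asaXfield %{DATA:field}="%{DATA:value}"

and then in the filter:

grok {
  # Tell grok where to find the custom pattern file above.
  patterns_dir => ["/etc/logstash/patterns"]
  match => ["message", "%{GREEDYDATA:header} %{asaXfield}%{GREEDYDATA:rest}"]
}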

I couldn't find any great documentation on the patterns themselves; they're basically regex. However, you can look at the ones that ship with Logstash and get an idea of how they work.


Thanks for the information given. I have been able to create a pattern using the grok debugger;
however, when entering it into the Logstash config it errors on the %{DATA:field}="%{DATA:something}" part, as I think the filter sees the " as the end of the string.

Config line so far, that doesn't work:
match => ["message", "%{SYSLOG5424PRI}%{INT:logtype} %{TIMESTAMP_ISO8601} %{HOSTNAME} CiscoNGFW %{INT:lognumber} %{NOTSPACE} %{DATA:field}="%{DATA:Flow_Dst_Service}" %{GREEDYDATA:cxmessage}"]

Config line that works:
match => ["message", "%{SYSLOG5424PRI}%{INT:logtype} %{TIMESTAMP_ISO8601} %{HOSTNAME} CiscoNGFW %{INT:lognumber} %{NOTSPACE} %{GREEDYDATA:cxmessage}"]

How do I add the %{DATA:field}="%{DATA:Flow_Dst_Service}" part into the config?

Thanks again

Mark

I'm not somewhere I can confirm it right now, but I'm pretty sure you can escape those out.

match => ["message", "%{SYSLOG5424PRI}%{INT:logtype} %{TIMESTAMP_ISO8601} %{HOSTNAME} CiscoNGFW %{INT:lognumber} %{NOTSPACE} %{DATA:field}=\"%{DATA:Flow_Dst_Service}\" %{GREEDYDATA:cxmessage}"

Note the " where you want a literal " in your pattern.

Someone in this forum advised me to use the kv plugin.
It's much easier to use 🙂

kv {
allow_duplicate_values => false
}
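
That fits this log nicely: once the syslog header is stripped off, the rest is already key="value" pairs. A minimal sketch, assuming the pairs have been grokked into a field named cxmessage as in the earlier posts:

kv {
  # Parse the key="value" section rather than the whole line.
  source => "cxmessage"
  allow_duplicate_values => false
}

By default kv splits fields on whitespace and keys from values on =, and it strips the surrounding quotes from the values for you.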


Hi Alex_Lum
You absolute legend.

That has worked perfectly. I had to do a grok match to strip out some unwanted info first, then run the kv filter on the remaining message.
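
For anyone finding this later, the combination ends up looking roughly along these lines; a sketch only, reconstructed from the thread, with illustrative field names (severity, cxmessage):

filter {
  grok {
    # Strip the syslog-style header and capture the key="value" section.
    match => ["message", "%{SYSLOG5424PRI}%{INT:logtype} %{TIMESTAMP_ISO8601:time} %{HOSTNAME:device} CiscoNGFW %{INT:lognumber} %{INT:severity} \[ngfwEvent@%{INT} %{DATA:cxmessage}\]$"]
  }
  kv {
    # Turn every key="value" pair in that section into its own field.
    source => "cxmessage"
    allow_duplicate_values => false
  }
}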

So happy I can now get on with building the dashboards management want.
Thanks again
Mark