Cisco ASA - grok failure, probably

Hey guys!

I am having a hard time with Logstash and a syslog file that collects info from multiple Cisco routers/firewalls.

The syslog file looks like this:

Aug 3 00:00:03 %ASA-6-305011: Built dynamic TCP translation from inside: to outside:
Aug 3 00:00:03 %ASA-6-106100: access-list adout permitted tcp tdd-rd-db/ -> ad/ hit-cnt 1 first hit [0x622c7137, 0x2b314401]
Aug 3 00:00:03 %ASA-6-302013: Built inbound TCP connection 994087149 for rrm-pr-db: ( to ad: (

The Logstash config is like this:

input {
  beats {
    port => 5044
    type => "cisco-fw"
  }
}

filter {
  grok {
    match => ["message", "%{CISCO_TAGGED_SYSLOG} %{GREEDYDATA:cisco_message}"]
  }

  grok {
    match => [
      "cisco_message", "%{CISCOFW106001}",
      "cisco_message", "%{CISCOFW106006_106007_106010}",
      "cisco_message", "%{CISCOFW106014}",
      "cisco_message", "%{CISCOFW106015}",
      "cisco_message", "%{CISCOFW106021}",
      "cisco_message", "%{CISCOFW106023}",
      "cisco_message", "%{CISCOFW106100}",
      "cisco_message", "%{CISCOFW110002}",
      "cisco_message", "%{CISCOFW302010}",
      "cisco_message", "%{CISCOFW302013_302014_302015_302016}",
      "cisco_message", "%{CISCOFW302020_302021}",
      "cisco_message", "%{CISCOFW305011}",
      "cisco_message", "%{CISCOFW313001_313004_313008}",
      "cisco_message", "%{CISCOFW313005}",
      "cisco_message", "%{CISCOFW402117}",
      "cisco_message", "%{CISCOFW402119}",
      "cisco_message", "%{CISCOFW419001}",
      "cisco_message", "%{CISCOFW419002}",
      "cisco_message", "%{CISCOFW500004}",
      "cisco_message", "%{CISCOFW602303_602304}",
      "cisco_message", "%{CISCOFW710001_710002_710003_710005_710006}",
      "cisco_message", "%{CISCOFW713172}",
      "cisco_message", "%{CISCOFW733100}"
    ]
  }
}

The problem is that Kibana stores the whole message like this instead of splitting it into the fields from the message itself (ip, dst, etc.):

@timestamp August 3rd 2018, 12:54:59.944
t @version 1
t _id p7xt_2QBG-A9L2YKJP45
t _index logstash-2018.08.03
# _score -
t _type doc
t beat.hostname server111
t server111
t beat.version 6.3.2
t fields.env prod
t server111
t input.type log
t message Aug 3 03:39:48 %ASA-6-106100: access-list dst-out permitted udp mggmt/ -> dis-c-link/ hit-cnt 1 first hit [0x1a11c48c, 0x00000000]
# offset 7,910,884,853
t prospector.type log
t source /data/kibana/log/syslog
t tags beats_input_codec_plain_applied, _grokparsefailure, _geoip_lookup_failure
t type cisco-fw

any ideas?

thank you!

Your grok is not able to parse the data (note the _grokparsefailure tag). Is this happening for all ASA logs or just a few?

I only have one log file: the syslog that all the Cisco devices dump into. No other logs.

See the definition of CISCO_TAGGED_SYSLOG, as shipped with Logstash's core patterns:

CISCO_TAGGED_SYSLOG ^<%{POSINT:syslog_pri}>%{CISCOTIMESTAMP:timestamp}( %{SYSLOGHOST:sysloghost})? ?: %%{CISCOTAG:ciscotag}:

Your log file doesn't have a syslog priority field (the <N> prefix) at the beginning of each line, so that pattern won't match in your case.
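To illustrate the mismatch: the stock pattern is anchored on a leading syslog priority like <190>, which your lines never carry. A rough Python approximation of just that prefix check (the priority value 190 is made up for illustration):

```python
import re

# Approximation of the leading part of CISCO_TAGGED_SYSLOG:
# a syslog priority like "<190>" is required at the start of the line.
pri_prefix = re.compile(r"^<\d+>")

with_pri = "<190>Aug  3 00:00:03 %ASA-6-305011: Built dynamic TCP translation"
without_pri = "Aug  3 00:00:03 %ASA-6-305011: Built dynamic TCP translation"

print(bool(pri_prefix.match(with_pri)))     # priority present: prefix matches
print(bool(pri_prefix.match(without_pri)))  # no priority: the whole grok fails
```

Since grok requires the entire expression to match, a missing prefix is enough to tag the event with _grokparsefailure.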


Good to know. Any tips on how to fix this?

The thing is, I don't quite get how to take a pattern from the Grok Debugger and put it into the Logstash config file.

Could you try with the following config?

%{CISCOTIMESTAMP:timestamp} %{SYSLOGHOST:sysloghost} %%{CISCOTAG:ciscotag}: %{GREEDYDATA:cisco_message}
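Under the hood, a grok expression is just a regex with named captures. A rough, hand-approximated Python equivalent of the pattern above (the real CISCOTIMESTAMP, SYSLOGHOST, and CISCOTAG definitions are more permissive; the hostname fw01 is made up, since the posted lines have theirs elided):

```python
import re

# Rough approximations of the grok tokens used in the suggested pattern.
pattern = re.compile(
    r"(?P<timestamp>\w{3}\s+\d{1,2}\s+\d{2}:\d{2}:\d{2})"  # %{CISCOTIMESTAMP}
    r"\s+(?P<sysloghost>[\w.-]+)"                          # %{SYSLOGHOST}
    r"\s+%(?P<ciscotag>[A-Z0-9]+-\d+-\d+)"                 # %%{CISCOTAG}
    r":\s+(?P<cisco_message>.*)"                           # %{GREEDYDATA}
)

# "fw01" is a hypothetical device hostname.
line = "Aug  3 00:00:03 fw01 %ASA-6-305011: Built dynamic TCP translation"
m = pattern.match(line)
print(m.groupdict())
```

Each %{PATTERN:name} token becomes a named capture group, which is exactly what ends up as a field on the event in Kibana.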

I tried that in the Grok Debugger and it matches! You are the grok master wizard, NerdSec :slight_smile: Thank you so much!

I will edit this line and it should work:

match => ["message", "%{CISCO_TAGGED_SYSLOG} %{GREEDYDATA:cisco_message}"]
to this
match => ["message", "%{CISCOTIMESTAMP:timestamp} %{SYSLOGHOST:sysloghost} %%{CISCOTAG:ciscotag}: %{GREEDYDATA:cisco_message}"]

I will get back with the result.
I will delete the Elasticsearch data first.

Okay, the result is better now. I will try to figure out some nice Kibana dashboards.

I should be a happy camper and forget about it, but maybe some details can help me and others going further.
Are "CISCOTIMESTAMP" and "GREEDYDATA" hardcoded in Logstash itself?
The documentation on this is quite weak (Grok filter plugin | Logstash Reference [7.15] | Elastic). How do you guys build your grok queries?

thanks for your support guys!


Refer to this link (the logstash-patterns-core repository). It has all the patterns shipped with Logstash. This is also linked from the documentation for grok.

I usually use the Grok Debugger to build my patterns, though I often prefer something like dissect, kv, or similar filters. Here is a nice post about grok and dissect.
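For reference, dissect differs from grok in that it consumes literal delimiters between fields instead of running a regex, which is cheaper for fixed-layout lines like these. A minimal Python sketch of the idea (the hostname fw01 is made up; field names mirror the grok pattern above):

```python
def dissect_cisco(line):
    """Delimiter-based split, dissect-style, for lines shaped like
    '<MMM dd HH:MM:SS> <host> %<tag>: <rest>'."""
    # The classic syslog timestamp is fixed-width: 15 characters.
    timestamp, rest = line[:15], line[16:]
    host, rest = rest.split(" ", 1)
    tag, message = rest.split(": ", 1)
    return {
        "timestamp": timestamp,
        "sysloghost": host,
        "ciscotag": tag.lstrip("%"),
        "cisco_message": message,
    }

print(dissect_cisco(
    "Aug  3 00:00:03 fw01 %ASA-6-305011: Built dynamic TCP translation"
))
```

The trade-off: dissect is faster and simpler, but every line must share the same layout; grok tolerates variation at the cost of regex matching.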

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.