Logstash 2.0.0 and Cisco ASA syslog

I have all my Cisco devices forwarding syslog to a central server, which uses Logstash-Forwarder to ship the logs to Logstash. For general syslog this works great, but I can't get Logstash to properly grok the logs from my ASAs. I followed several tutorials online (such as https://jackhanington.com/blog/2014/04/21/using-logstash-elasticsearch-and-kibana-for-cisco-asa-syslog-message-analysis/), but I keep getting the dreaded _grokparsefailure tag instead. The only thing I can think of is that the logs somehow don't match the expected patterns; I tried https://grokdebug.herokuapp.com/ and it sometimes reports success and sometimes doesn't.

Has something changed in Logstash 2? Is my input messed up? Any guidance whatsoever would be appreciated.

Below are a few (sanitized) lines from my syslog:
2015-11-25T11:53:09.089380-08:00 xx.xx.xx.xx %ASA-6-302015: Built outbound UDP connection 2394972349 for outside:xx.xx.xx.xx/53 (xx.xx.xx.xx/53) to inside:xx.xx.xx.xx/56848 (xx.xx.xx.xx/56848)
2015-11-25T11:53:09.089673-08:00 xx.xx.xx.xx %ASA-6-302014: Teardown TCP connection 2394972321 for outside:xx.xx.xx.xx/80 to inside:xx.xx.xx.xx/52005 duration 0:00:00 bytes 854 TCP FINs
2015-11-25T11:53:09.091128-08:00 xx.xx.xx.xx %ASA-6-305012: Teardown dynamic TCP translation from inside:xx.xx.xx.xx/1769 to outside:xx.xx.xx.xx/1769 duration 0:01:01
2015-11-25T11:53:09.091186-08:00 xx.xx.xx.xx %ASA-6-305012: Teardown dynamic UDP translation from Training:xx.xx.xx.xx/52497 to outside:xx.xx.xx.xx/52497 duration 0:00:31
2015-11-25T11:53:09.091186-08:00 xx.xx.xx.xx %ASA-6-305012: Teardown dynamic UDP translation from Training:xx.xx.xx.xx/52230 to outside:xx.xx.xx.xx/52230 duration 0:00:31
2015-11-25T11:53:09.092227-08:00 xx.xx.xx.xx %ASA-6-302014: Teardown TCP connection 2394972322 for outside:xx.xx.xx.xx/80 to inside:xx.xx.xx.xx/52006 duration 0:00:00 bytes 982 TCP FINs
2015-11-25T11:53:09.097249-08:00 xx.xx.xx.xx %ASA-6-302016: Teardown UDP connection 2394972349 for outside:xx.xx.xx.xx/53 to inside:xx.xx.xx.xx/56848 duration 0:00:00 bytes 269
2015-11-25T11:53:09.107922-08:00 xx.xx.xx.xx %ASA-6-305012: Teardown dynamic TCP translation from inside:xx.xx.xx.xx/48261 to outside:xx.xx.xx.xx/48261 duration 0:01:01
2015-11-25T11:53:09.107922-08:00 xx.xx.xx.xx %ASA-6-305012: Teardown dynamic TCP translation from inside:xx.xx.xx.xx/1754 to outside:xx.xx.xx.xx/1754 duration 0:01:01
2015-11-25T11:53:09.124678-08:00 xx.xx.xx.xx %ASA-6-305012: Teardown dynamic TCP translation from inside:xx.xx.xx.xx/3724 to outside:xx.xx.xx.xx/45036 duration 0:01:01

What's your configuration? Are the lines above examples of failures? If not, please supply such a line.

Those lines are from the syslog file, which I assume is the source. It appears that all of the ASA logs get the _grokparsefailure tag whenever I try to use a grok pattern such as:
grok {
  match => [
    "cisco_message", "%{CISCOFW106001}",
    "cisco_message", "%{CISCOFW106006_106007_106010}",
    "cisco_message", "%{CISCOFW106014}",
    "cisco_message", "%{CISCOFW106015}",
    "cisco_message", "%{CISCOFW106021}",
    "cisco_message", "%{CISCOFW106023}",
    "cisco_message", "%{CISCOFW106100}",
    "cisco_message", "%{CISCOFW110002}",
    "cisco_message", "%{CISCOFW302010}",
    "cisco_message", "%{CISCOFW302013_302014_302015_302016}",
    "cisco_message", "%{CISCOFW302020_302021}",
    "cisco_message", "%{CISCOFW305011}",
    "cisco_message", "%{CISCOFW313001_313004_313008}",
    "cisco_message", "%{CISCOFW313005}",
    "cisco_message", "%{CISCOFW402117}",
    "cisco_message", "%{CISCOFW402119}",
    "cisco_message", "%{CISCOFW419001}",
    "cisco_message", "%{CISCOFW419002}",
    "cisco_message", "%{CISCOFW500004}",
    "cisco_message", "%{CISCOFW602303_602304}",
    "cisco_message", "%{CISCOFW710001_710002_710003_710005_710006}",
    "cisco_message", "%{CISCOFW713172}",
    "cisco_message", "%{CISCOFW733100}"
  ]
}
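One thing worth noting about this approach: the stock CISCOFW* patterns match only the text after the "%ASA-6-NNNNNN:" tag, so something has to put just that part into cisco_message first. A sketch of a header-stripping grok for lines shaped like the samples above (the field names here are illustrative, not taken from any particular tutorial):

```
grok {
  # Split "2015-11-25T11:53:09... host %ASA-6-302015: <body>" into parts;
  # only the text after the colon lands in cisco_message.
  match => ["message", "%{TIMESTAMP_ISO8601:syslog_timestamp} %{SYSLOGHOST:syslog_host} %%{CISCOTAG:ciscotag}: %{GREEDYDATA:cisco_message}"]
}
```

If cisco_message instead contains the whole raw line, header included, every CISCOFW* pattern will fail and the event will be tagged _grokparsefailure.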

########################### Logstash config ###################
input {
  lumberjack {
    port => 5043
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

filter {
  if "syslog" in [tags] and "pre-processed" not in [tags] {
    if "%ASA-" in [message] {
      mutate {
        add_tag => [ "pre-processed", "Firewall", "ASA" ]
        add_field => [ "syslog_raw_message", "%{message}" ]
      }
      syslog_pri { }
      grok {
        patterns_dir => "/opt/logstash/patterns/custom"
        match => ["message", "%{GREEDYDATA:cisco_message}"]
      }
    }
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
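As a sanity check on why the CISCOFW patterns might be failing, here is a quick standalone Python sketch (not Logstash itself, just an illustration with made-up IPs in place of the sanitized ones) of splitting one of the sample lines into header and body. The CISCOFW patterns only see what ends up in cisco_message, so if the whole raw line, header included, is fed to them, they cannot match:

```python
import re

# One of the sample lines, with illustrative documentation IPs substituted.
line = ("2015-11-25T11:53:09.089673-08:00 192.0.2.1 %ASA-6-302014: "
        "Teardown TCP connection 2394972321 for outside:198.51.100.2/80 "
        "to inside:192.0.2.5/52005 duration 0:00:00 bytes 854 TCP FINs")

# A simplified stand-in for a header-stripping grok:
# ISO8601 timestamp, host, %ASA-<severity>-<id>: tag, then the message body.
header_re = re.compile(
    r"^(?P<timestamp>\S+)\s+(?P<host>\S+)\s+"
    r"%ASA-(?P<severity>\d)-(?P<msg_id>\d+):\s+(?P<cisco_message>.*)$"
)

m = header_re.match(line)
print(m.group("msg_id"))         # the ASA message ID, e.g. 302014
print(m.group("cisco_message"))  # body only, starting at "Teardown TCP ..."
```

Only the body captured in cisco_message is in the shape the CISCOFW302013_302014_302015_302016 pattern expects.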

I think I may have solved my own problem (hurray!)

I was looking at https://jackhanington.com/blog/tag/logstash/, where he creates a script that loads a mapping template into Elasticsearch. I did this but was still getting errors, until I noticed that he tagged his traffic type as "cisco-fw". I did the same and voilà!

So far there have been no _grokparsefailure tags or error lines appearing in my logstash.log.
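For anyone else hitting this: the type is set on the shipper side. A sketch of the relevant logstash-forwarder config fragment, assuming the ASA logs land in a file like the illustrative path below (adjust to your own setup):

```
{
  "files": [
    {
      "paths": [ "/var/log/cisco-asa.log" ],
      "fields": { "type": "cisco-fw" }
    }
  ]
}
```

With the type set there, the Logstash filter can match on it (e.g. `if [type] == "cisco-fw"`) instead of sniffing for "%ASA-" in the message.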

@edgoad can you give me your config please? Your input, filter, and output. I'm lost.
I have Filebeat working fine and I want to add the Cisco ASA config.
Need help, please.