Hi
We want to have a directory with several files containing different grok expressions for different devices and vendors. So we created a directory called "patterns" in "/etc/logstash/conf.d/". In that folder we have two files at the moment, "file1" and "file2".
"file1" contains the following grok expressions:
SFR_DROP %{CISCOTIMESTAMP:localtime}.%{IPORHOST:host}.%%{CISCOTAG:ciscotag}: SFR requested to %{WORD:action} %{WORD:protocol} packet from %{IPORHOST:src_interface}:%{IPORHOST:src_ip}/%{NUMBER:src_port}
SFR_BYPASS %{CISCOTIMESTAMP:localtime}.%{IPORHOST:host}.%%{CISCOTAG:ciscotag}:.%{WORD}.%{WORD}.%{WORD}.%{WORD}.%{WORD:bypass}.%{WORD}.%{WORD}.%{WORD}.%{WORD}.%{WORD}.%{WORD:protocol}.%{WORD}.%{WORD}.%{IPORHOST:src_interface}:%{IPORHOST:src_ip}/%{NUMBER:src_port} %{WORD}.%{IPORHOST:dst_interface}:%{IPORHOST:dst_ip}/%{NUMBER:dst_port}.%{GREEDYDATA}
SFR_DROP_NO_HOST_IN_SYSLOG %{CISCOTIMESTAMP:localtime}: %%{CISCOTAG:ciscotag}: SFR requested to %{WORD:action} %{WORD:protocol} packet from %{IPORHOST:src_interface}:%{IPORHOST:src_ip}/%{NUMBER:src_port}
Here is an example log:
Jul 19 2018 11:09:39: %ASA-4-434002: SFR requested to drop TCP packet from OUTSIDE-VRF180:123.123.123.123/80 to INSIDE-VRF4100:123.123.123.123/14927
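To sanity-check a pattern outside Logstash, the grok primitives can be approximated with plain named-group regexes. This is only a rough sketch: the stand-ins below for CISCOTIMESTAMP, CISCOTAG, WORD, IPORHOST and NUMBER are hand-simplified approximations, not the real grok definitions, which are more permissive.

```python
import re

# Hand-simplified stand-ins for the grok primitives used above; the real
# grok definitions are more permissive than these approximations.
CISCOTIMESTAMP = r"\w{3} +\d{1,2} \d{4} \d{2}:\d{2}:\d{2}"
CISCOTAG = r"[A-Z0-9]+-\d+-[A-Z0-9_]+"
WORD = r"\w+"
IPORHOST = r"[\w.\-]+"
NUMBER = r"\d+"

# Approximation of SFR_DROP_NO_HOST_IN_SYSLOG from file1
SFR_DROP_NO_HOST = re.compile(
    rf"(?P<localtime>{CISCOTIMESTAMP}): "
    rf"%(?P<ciscotag>{CISCOTAG}): SFR requested to "
    rf"(?P<action>{WORD}) (?P<protocol>{WORD}) packet from "
    rf"(?P<src_interface>{IPORHOST}):(?P<src_ip>{IPORHOST})/(?P<src_port>{NUMBER})"
)

# The example log from above
line = ("Jul 19 2018 11:09:39: %ASA-4-434002: SFR requested to drop TCP "
        "packet from OUTSIDE-VRF180:123.123.123.123/80 to "
        "INSIDE-VRF4100:123.123.123.123/14927")
m = SFR_DROP_NO_HOST.search(line)
```

With the example line, `m.group("action")` comes out as "drop" and `m.group("src_ip")` as "123.123.123.123", which matches what the grok debuggers report.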
And "file2" contains the following grok expressions:
ASSIGNED_SESSION %{CISCOTIMESTAMP:localtime} %{IPORHOST:host} %%{CISCOTAG:ciscotag}: Group <%{HOSTNAME:group}> User <%{EMAILADDRESS:user}> IP <%{IP:src_ip}> IPv4 Address <%{IP:vpn_ip}> IPv6 address <%{IPV6:vpn_ipv6}> %{GREEDYDATA:message}
NO_IPV6 %{CISCOTIMESTAMP:localtime} %{IPORHOST:host} %%{CISCOTAG:ciscotag}: TunnelGroup <%{USER:tunnel_group}> GroupPolicy <%{USER:group_policy}> User <%{EMAILADDRESS:user}> IP <%{IP:src_ip}> %{GREEDYDATA:message}
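The same quick check works for the VPN patterns in "file2". The regex below is a hand-simplified approximation of NO_IPV6, and the sample line is invented for illustration (it is not one of our real logs):

```python
import re

# Simplified stand-in for the grok CISCOTIMESTAMP primitive
TS = r"\w{3} +\d{1,2} \d{4} \d{2}:\d{2}:\d{2}"

# Hand-simplified approximation of the NO_IPV6 pattern from file2
NO_IPV6 = re.compile(
    rf"(?P<localtime>{TS}) (?P<host>[\w.\-]+) "
    r"%(?P<ciscotag>[A-Z0-9]+-\d+-\w+): "
    r"TunnelGroup <(?P<tunnel_group>[\w.\-]+)> "
    r"GroupPolicy <(?P<group_policy>[\w.\-]+)> "
    r"User <(?P<user>[^>]+)> IP <(?P<src_ip>[\d.]+)> "
    r"(?P<message>.*)"
)

# Hypothetical ASA VPN log line, invented for illustration only
line = ("Jul 19 2018 11:09:39 fw01 %ASA-6-722022: TunnelGroup <VPN-TG> "
        "GroupPolicy <VPN-GP> User <alice@example.com> IP <203.0.113.7> "
        "First TCP SVC connection established")
m = NO_IPV6.match(line)
```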
And in the main Logstash folder "/etc/logstash/conf.d" we have the following file, which makes use of the custom patterns created above:
filter {
  if "syslog" in [tags] and "pre-processed" not in [tags] {
    if "%ASA-" in [message] {
      mutate {
        gsub => [
          "message", "<161>", "",
          "message", "<162>", "",
          "message", "<163>", "",
          "message", "<164>", "",
          "message", "<165>", "",
          "message", "<166>", "",
          "message", "<167>", "",
          "message", "<168>", "",
          "message", "<169>", ""
        ]
        add_tag => [ "pre-processed", "Firewall", "ASA" ]
      }
      grok {
        patterns_dir => ["/etc/logstash/conf.d/patterns"]
        match => [
          "message", "%{CISCOFW106001}",
          "message", "%{CISCOFW106006_106007_106010}",
          "message", "%{CISCOFW106014}",
          "message", "%{CISCOFW106015}",
          "message", "%{CISCOFW106021}",
          "message", "%{CISCOFW106023}",
          "message", "%{CISCOFW106100}",
          "message", "%{CISCOFW110002}",
          "message", "%{CISCOFW302010}",
          "message", "%{CISCOFW302013_302014_302015_302016}",
          "message", "%{CISCOFW302020_302021}",
          "message", "%{CISCOFW305011}",
          "message", "%{CISCOFW313001_313004_313008}",
          "message", "%{CISCOFW313005}",
          "message", "%{CISCOFW402117}",
          "message", "%{CISCOFW402119}",
          "message", "%{CISCOFW419001}",
          "message", "%{CISCOFW419002}",
          "message", "%{CISCOFW500004}",
          "message", "%{CISCOFW602303_602304}",
          "message", "%{CISCOFW710001_710002_710003_710005_710006}",
          "message", "%{CISCOFW713172}",
          "message", "%{CISCOFW733100}",
          "message", "^%{ASSIGNED_SESSION}$",
          "message", "^%{NO_IPV6}$",
          "message", "^%{SFR_DROP}$",
          "message", "^%{SFR_BYPASS}$"
        ]
      }
      geoip {
        source => "src_ip"
        target => "geoip"
      }
    }
  }
}
However, messages that should be matched and tagged correctly still end up with the "_grokparsefailure" tag. We have tried the grok expressions in various online debuggers and in Kibana's built-in Grok Debugger, all of which indicate that the matches should work.
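When everything looks right in the debuggers, it can help to rule out how the patterns are loaded on the actual box. A minimal standalone pipeline (a sketch; the file name "test-pattern.conf" is hypothetical, paths as in the setup above) feeds sample lines through one custom pattern at a time:

```text
# test-pattern.conf -- hypothetical one-off pipeline for isolating a pattern
input { stdin {} }
filter {
  grok {
    patterns_dir => ["/etc/logstash/conf.d/patterns"]
    match => { "message" => "%{SFR_DROP_NO_HOST_IN_SYSLOG}" }
  }
}
output { stdout { codec => rubydebug } }
```

Running this with `bin/logstash -f test-pattern.conf` and pasting the example log line shows whether the pattern files themselves are being picked up; a "_grokparsefailure" here would point at pattern loading or definition rather than at the main pipeline.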
Can anyone help identify the issue here?