Trying to use custom patterns without success

I have an ELK stack set up on Ubuntu 16.04, all components running version 5.2, and everything is working fine. I am using the 05-ciscoasa-filter.conf file below (in /etc/logstash/conf.d) to sort logs from a Cisco ASA that are dumped to a syslog server and shipped to Logstash via Filebeat. That part is working fine.

filter {
  if [type] == "cisco-asa" {
    grok {
      match => [
        "message", "%{CISCOFW106001}",
        "message", "%{CISCOFW106006_106007_106010}",
        "message", "%{CISCOFW106014}",
        "message", "%{CISCOFW106015}",
        "message", "%{CISCOFW106021}",
        "message", "%{CISCOFW106023}",
        "message", "%{CISCOFW106100}",
        "message", "%{CISCOFW110002}",
        "message", "%{CISCOFW302010}",
        "message", "%{CISCOFW302013_302014_302015_302016}",
        "message", "%{CISCOFW302020_302021}",
        "message", "%{CISCOFW305011}",
        "message", "%{CISCOFW313001_313004_313008}",
        "message", "%{CISCOFW313005}",
        "message", "%{CISCOFW402117}",
        "message", "%{CISCOFW402119}",
        "message", "%{CISCOFW419001}",
        "message", "%{CISCOFW419002}",
        "message", "%{CISCOFW500004}",
        "message", "%{CISCOFW602303_602304}",
        "message", "%{CISCOFW710001_710002_710003_710005_710006}",
        "message", "%{CISCOFW713172}",
        "message", "%{CISCOFW733100}"
      ]
    }
  }
  syslog_pri {
  }
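  # the two add_field entries below build a [longitude, latitude] array in
  # [geoip][coordinates]; the mutate that follows casts the values to float
  # so Elasticsearch can map the field as a geo_point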
  geoip {
    source => "src_ip"
    target => "geoip"
    database => "/data/geoipdb/GeoLite2-City.mmdb"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}"  ]
  }
  mutate {
    convert => [ "[geoip][coordinates]", "float"]
  }
}

I need to add additional patterns to match Cisco ASA log types that are not covered by the core patterns, so I set up a patterns file with the contents below.

#== Additional Cisco ASA patterns ==#

# Common Particles


# ASA-5-713050
CISCOFW713050 Group = %{GREEDYDATA:vpngroup}, Username = %{GREEDYDATA:vpnuser}, IP = %{IP:src_ip}, %{GREEDYDATA:vpnmessage}  Reason: %{GREEDYDATA:vpnreason}  Remote Proxy %{GREEDYDATA:vpnremoteprox}, Local Proxy %{GREEDYDATA:vpnlocalprox}

The patterns file above should cover the example log entry below, which has been edited to remove some details.

Feb 14 15:01:03 x.x.x.x %ASA-5-713050: Group = text, Username = username, IP = x.x.x.x, Connection terminated for peer username. Reason: Peer Terminate Remote Proxy x.x.x.x, Local Proxy x.x.x.x
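Before wiring the pattern into the main pipeline, it can be exercised in isolation with a throwaway config run via bin/logstash -f (a minimal sketch; the file name is arbitrary, and the patterns_dir path matches the one used below):

input { stdin { } }
filter {
  grok {
    # load the custom patterns file and try only the new pattern
    patterns_dir => ["/opt/logstash/patterns"]
    match => [ "message", "%{CISCOFW713050}" ]
  }
}
output { stdout { codec => rubydebug } }

Pasting the sample line on stdin should either print the parsed vpn* fields or tag the event with _grokparsefailure. Note that grok is whitespace-sensitive, so the literal double spaces before "Reason:" and "Remote Proxy" in the pattern must be present in the raw message.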

I placed the patterns file at /opt/logstash/patterns/ciscoasa-logstash-patterns. Permissions give the logstash user and group access to the complete path, with 644 on the file itself. I then added the lines marked with comments below (the patterns_dir option and the extra match pattern) to the 05-ciscoasa-filter.conf configuration listed above.

filter {
  if [type] == "cisco-asa" {
    grok {
      patterns_dir => ["/opt/logstash/patterns"]  #added
      match => [
        "message", "%{CISCOFW106001}",

        # ... (remaining core patterns unchanged) ...

        "message", "%{CISCOFW733100}",
        # Additional patterns not included in the core
        "message", "%{CISCOFW713050}"

When logs of that type were sent through the ELK server, they came out tagged with "_grokparsefailure". I ran the pattern through an online grok tester, and it showed no errors in the syntax of my patterns file.

What am I doing wrong? Should I not be using a patterns file? I'm not sure how to tell whether it even loads properly. I also tried adding the pattern directly in the 05-ciscoasa-filter.conf file, but that would bomb every time I tried to restart Logstash.
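For reference, the usual way to inline a custom pattern without a separate patterns file is the grok filter's pattern_definitions option, assuming the grok plugin version bundled with 5.2 supports it; a sketch using the pattern from above:

grok {
  # define the custom pattern inline instead of via patterns_dir
  pattern_definitions => {
    "CISCOFW713050" => "Group = %{GREEDYDATA:vpngroup}, Username = %{GREEDYDATA:vpnuser}, IP = %{IP:src_ip}, %{GREEDYDATA:vpnmessage}  Reason: %{GREEDYDATA:vpnreason}  Remote Proxy %{GREEDYDATA:vpnremoteprox}, Local Proxy %{GREEDYDATA:vpnlocalprox}"
  }
  match => [ "message", "%{CISCOFW713050}" ]
}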

I would like to get to a point where I can successfully add a new entry and pattern for each log type that is not defined in the Logstash core, and ideally duplicate that approach for other log types as well.

Any help would be greatly appreciated.

In the LS config, have you tried using only your custom pattern, "message", "%{CISCOFW713050}"? Just remove all the other default patterns first to make sure LS works properly with yours.
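For example, a trimmed-down sketch of your filter with just the custom pattern:

filter {
  if [type] == "cisco-asa" {
    grok {
      patterns_dir => ["/opt/logstash/patterns"]
      match => [ "message", "%{CISCOFW713050}" ]
    }
  }
}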

If you raise the log level high enough, Logstash will log information about all loaded grok patterns. Is your pattern in that list? If yes, simplify your pattern until it matches. Then start specializing it again, paying close attention to when it breaks.
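On Logstash 5.x the level can be raised either with the --log.level=debug command-line flag or persistently in the settings file (the path assumes a package install):

# /etc/logstash/logstash.yml, then restart Logstash
log.level: debug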

I raised the log level of Logstash this morning, set up my original configuration with the pattern file, and tested it. The logs showed no error on the log entry, and when I looked in Kibana, it worked. I guess I just needed to give it 24 hours of rest; either that, or I was missing something the first time I tested it. In any event, thanks for your help on this.
