Logstash Patterns - Firewalls - Best Place to Find it

My use case is Cisco ASA firewall logs but I think these questions apply more broadly.

I am trying to do the parsing of Cisco ASA logs in Logstash, not using Filebeat.

  1. I'd like not to reinvent the wheel, so where can I find the Filebeat Cisco module's code that does this parsing, so that I can reuse that code in my Logstash parsing?

  2. There is a logstash-patterns-core/patterns/firewall file, but it doesn't map fields to ECS field names. For example, it uses src_ip instead of source.address. Why on earth would Elastic put out this file and not use ECS field names?

  3. Related to #1, I've looked and looked and looked. Is there an updated GitHub page for mapping Cisco ASA to ECS fields? I would expect it to be the one I linked to already, but using custom, non-ECS fields is a non-starter.
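One stopgap for the ECS-naming problem in question 2 is to keep the legacy firewall patterns and rename their captures afterwards with a mutate filter. A minimal sketch, assuming the src_ip/dst_ip capture names used by that patterns file and assuming source.ip/destination.ip are the ECS targets you want (verify both against the actual pattern and the ECS reference):

    filter {
        grok {
            # CISCOFW106001 is one of the patterns in logstash-patterns-core/patterns/firewall
            match => { "message" => "%{CISCOFW106001}" }
        }
        # Rename the legacy captures to their ECS equivalents.
        # The right-hand names are assumptions about the intended mapping.
        mutate {
            rename => {
                "src_ip" => "[source][ip]"
                "dst_ip" => "[destination][ip]"
            }
        }
    }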

Hi,

Have a look here: https://github.com/elastic/beats/tree/master/x-pack/filebeat/module/cisco

In particular, here is the ingest pipeline for Elasticsearch, including the grok patterns: https://github.com/elastic/beats/blob/master/x-pack/filebeat/module/cisco/shared/ingest/asa-ftd-pipeline.yml
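If hand-converting that YAML turns out to be tedious, another option is to leave the parsing to the ingest pipeline itself and have Logstash merely route events to it: the elasticsearch output supports a pipeline option. A sketch, where the pipeline name is an assumption; check GET _ingest/pipeline for what `filebeat setup --pipelines` actually installed in your cluster:

    output {
        elasticsearch {
            hosts => ["https://localhost:9200"]
            # The exact pipeline name depends on the Filebeat version that
            # installed it; this one is illustrative, not authoritative.
            pipeline => "filebeat-7.9.3-cisco-asa-pipeline"
        }
    }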

Best regards
Wolfram

So I looked at the Filebeat modules, and they are not like the Logstash pipelines I am used to seeing. If I took those pipelines from Filebeat and put them in as Logstash pipelines, would they work as is?

For example, the ASA one starts with:

processors:
  - grok:
      field: message
      patterns:
        - "(?:%{SYSLOG_HEADER})?\s*%{GREEDYDATA:log.original}"
      pattern_definitions:
        SYSLOG_HEADER: "(?:%{SYSLOGFACILITY}\s*)?(?:%{FTD_DATE:temp.raw_date}:?\s+)?(?:%{PROCESS_HOST}|%{HOST_PROCESS})(?:%{DATA})?%{SYSLOG_END}?"
        SYSLOGFACILITY: "<%{NONNEGINT:syslog.facility:int}(?:.%{NONNEGINT:syslog.priority:int})?>"
        # Beginning with version 6.3, Firepower Threat Defense provides the option to enable timestamp as per RFC 5424.
        FTD_DATE: "(?:%{TIMESTAMP_ISO8601}|%{ASA_DATE})"
        ASA_DATE: "(?:%{DAY} )?%{MONTH} *%{MONTHDAY}(?: %{YEAR})? %{TIME}(?: %{TZ})?"
        PROCESS: "(?:[^%\s:\[]+)"
        SYSLOG_END: "(?:(:|\s)\s+)"
        # exactly match the syntax for firepower management logs
        PROCESS_HOST: "(?:%{PROCESS:process.name}:\s%{SYSLOGHOST:host.name})"
        HOST_PROCESS: "(?:%{SYSLOGHOST:host.hostname}:?\s+)?(?:%{PROCESS:process.name}?(?:\[%{POSINT:process.pid:long}\])?)?"

Would I have to wrap that in a filter {} block, and is "- grok" even valid syntax? I've always just used grok with no dash. If you can give me a start on what might need changing to make this usable in a Logstash pipeline, I can go from there.

That is YAML, which is not a valid Logstash configuration, but transforming it should be straightforward. I would try:

filter {
    grok {
        pattern_definitions => {
            "SYSLOG_HEADER" => "(?:%{SYSLOGFACILITY}\s*)?(?:%{FTD_DATE:temp.raw_date}:?\s+)?(?:%{PROCESS_HOST}|%{HOST_PROCESS})(?:%{DATA})?%{SYSLOG_END}?"
            "SYSLOGFACILITY" => "<%{NONNEGINT:syslog.facility:int}(?:.%{NONNEGINT:syslog.priority:int})?>"
            "FTD_DATE" => "(?:%{TIMESTAMP_ISO8601}|%{ASA_DATE})"
            "ASA_DATE" => "(?:%{DAY} )?%{MONTH} *%{MONTHDAY}(?: %{YEAR})? %{TIME}(?: %{TZ})?"
            "PROCESS" => "(?:[^%\s:\[]+)"
            "SYSLOG_END" => "(?:(:|\s)\s+)"
            "PROCESS_HOST" => "(?:%{PROCESS:process.name}:\s%{SYSLOGHOST:host.name})"
            "HOST_PROCESS" => "(?:%{SYSLOGHOST:host.hostname}:?\s+)?(?:%{PROCESS:process.name}?(?:\[%{POSINT:process.pid:long}\])?)?"
        }
        match => { "message" => "(?:%{SYSLOG_HEADER})?\s*%{GREEDYDATA:log.original}" }
    }
}
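Two porting caveats worth checking before relying on a one-to-one transcription, both hedged as my reading of current Logstash behavior: Logstash's grok filter documents only int and float type conversions (the :long suffix comes from the Elasticsearch ingest-node grok processor), and a dotted capture name such as process.pid produces a single top-level field literally named "process.pid" rather than a nested object. Bracket notation sidesteps both issues; for example, the last pattern definition might become:

    "HOST_PROCESS" => "(?:%{SYSLOGHOST:[host][hostname]}:?\s+)?(?:%{PROCESS:[process][name]}?(?:\[%{POSINT:[process][pid]:int}\])?)?"

Verify against the grok filter documentation for the Logstash version you run.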

Thank you. That's exactly the kind of help I needed. I'm just going to mark that as the solution even though I won't be able to test for a while.