Logstash Cisco ASA logs not parsing

I have a log file with entries like this:

2018-07-30 09:50:30||20||6||172.16.1.4||%ASA-6-302014: Teardown TCP connection 597505728 for outside:40.97.142.8/443 to inside:192.168.5.120/50041 duration 0:00:00 bytes 0 TCP Reset-O
2018-07-30 09:50:30||20||6||172.16.1.4||%ASA-6-305011: Built dynamic TCP translation from inside:192.168.5.120/50053 to outside:63.149.247.7/50053
2018-07-30 09:50:30||20||6||172.16.1.4||%ASA-6-302013: Built outbound TCP connection 597505730 for outside:40.97.134.216/443 (40.97.134.216/443) to inside:192.168.5.120/50053 (63.149.247.7/50053)
2018-07-30 09:50:30||20||6||172.16.1.4||%ASA-6-302014: Teardown TCP connection 597505730 for outside:40.97.134.216/443 to inside:192.168.5.120/50053 duration 0:00:00 bytes 0 TCP Reset-O
2018-07-30 09:50:30||20||6||172.16.1.4||%ASA-6-302014: Teardown TCP connection 597505432 for outside:40.97.142.2/443 to inside:192.168.11.82/63077 duration 0:00:04 bytes 5526 TCP FINs
2018-07-30 09:50:30||20||6||172.16.1.4||%ASA-6-302014: Teardown TCP connection 597503831 for outside:104.80.89.35/80 to inside:192.168.11.65/58618 duration 0:00:41 bytes 23403 TCP FINs
2018-07-30 09:50:30||20||6||172.16.1.4||%ASA-6-305011: Built dynamic TCP translation from inside:192.168.5.120/50055 to outside:63.149.247.7/54655
2018-07-30 09:50:30||20||6||172.16.1.4||%ASA-6-302013: Built outbound TCP connection 597505731 for outside:52.90.127.253/80 (52.90.127.253/80) to inside:192.168.5.120/50055 (63.149.247.7/54655)
2018-07-30 09:50:30||20||6||172.16.1.4||%ASA-6-305012: Teardown dynamic TCP translation from inside:192.168.11.65/35431 to outside:63.149.247.7/35431 duration 0:10:33

I am trying to parse it and then insert it into Elasticsearch, but I am not getting any parsing results and I cannot figure out what is wrong with the config.

input {
  file {
    path => "/Users/samvidkulkarni/Desktop/SOC_RAW_FILES/report_2018-07-30-10-00-01/etc/soc/reports/syslog/report.txt"
    type => "syslog"
    start_position => "beginning"
    ignore_older => 0
  }
}
filter {
  if [type == "syslog"] {
    # Split the syslog part and Cisco tag out of the message
    grok {
      match => ["message", "%{TIMESTAMP_ISO8601:@timestamp} || %{NUMBER:syslog_cat} || %{NUMBER:syslog_severity} || %{IP:client} || %%{CISCOTAG:ciscotag}: %{GREEDYDATA:cisco_message}"]
    }

    # Parse the syslog severity and facility
    syslog_pri { }

    # Parse the date from the "timestamp" field to the "@timestamp" field
    date {
      match => ["@timestamp",
        "yyyy-MM-dd HH:mm:ss"
      ]
      timezone => "America/Los_Angeles"
    }

    # Clean up redundant fields if parsing was successful
    if "_grokparsefailure" not in [tags] {
      mutate {
        rename => ["cisco_message", "message"]
        remove_field => ["timestamp"]
      }
    }

    # Extract fields from each of the detailed message types
    # The patterns provided below are included in Logstash since 1.2.0
    grok {
      match => [
        "message", "%{CISCOFW106001}",
        "message", "%{CISCOFW106006_106007_106010}",
        "message", "%{CISCOFW106014}",
        "message", "%{CISCOFW106015}",
        "message", "%{CISCOFW106021}",
        "message", "%{CISCOFW106023}",
        "message", "%{CISCOFW106100}",
        "message", "%{CISCOFW110002}",
        "message", "%{CISCOFW302010}",
        "message", "%{CISCOFW302013_302014_302015_302016}",
        "message", "%{CISCOFW302020_302021}",
        "message", "%{CISCOFW305011}",
        "message", "%{CISCOFW313001_313004_313008}",
        "message", "%{CISCOFW313005}",
        "message", "%{CISCOFW402117}",
        "message", "%{CISCOFW402119}",
        "message", "%{CISCOFW419001}",
        "message", "%{CISCOFW419002}",
        "message", "%{CISCOFW500004}",
        "message", "%{CISCOFW602303_602304}",
        "message", "%{CISCOFW710001_710002_710003_710005_710006}",
        "message", "%{CISCOFW713172}",
        "message", "%{CISCOFW733100}"
      ]
    }
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

This is my output in Kibana. It is not generating any errors, but it is not extracting any information either:

"path" => "/Users/samvidkulkarni/Desktop/SOC_RAW_FILES/report_2018-07-30-10-00-01/etc/soc/reports/syslog/report.txt",
"host" => "Miltons-MacBook-Air.local",
"@version" => "1",
"@timestamp" => 2018-08-09T17:32:58.940Z,
"message" => "2018-07-30 09:55:29||20||6||172.16.1.4||%ASA-6-302013: Built outbound TCP connection 597520091 for outside:52.90.127.252/80 (52.90.127.252/80) to inside:192.168.11.186/58727 (63.149.247.7/58727)",
"type" => "syslog"

Your pattern has spaces around the delimiters but the messages do not. Also, | is used for alternation in a regexp, so it has to be escaped to match a literal pipe in the message. Finally, you cannot grok into @timestamp and then run a date filter on it; capture into a separate field such as timestamp instead. Try this:

grok {
  match => ["message", "%{TIMESTAMP_ISO8601:timestamp}\|\|%{NUMBER:syslog_cat}\|\|%{NUMBER:syslog_severity}\|\|%{IP:client}\|\|%%{CISCOTAG:ciscotag}: %{GREEDYDATA:cisco_message}"]
}
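
With the first sample line, that grok should pull out something like the following (a sketch; the exact rubydebug output will include your other fields too):

"timestamp" => "2018-07-30 09:50:30",
"syslog_cat" => "20",
"syslog_severity" => "6",
"client" => "172.16.1.4",
"ciscotag" => "ASA-6-302014",
"cisco_message" => "Teardown TCP connection 597505728 for outside:40.97.142.8/443 to inside:192.168.5.120/50041 duration 0:00:00 bytes 0 TCP Reset-O"

Your mutate then renames cisco_message to message, so the second grok with the CISCOFW patterns still applies as-is.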

Thank you very much for responding. I made all the changes you suggested and it is working now, but I am still getting a _dateparsefailure tag.

I made a few changes to the date filter, as shown below:

filter {
  #if [type == "syslog"] {
    # Split the syslog part and Cisco tag out of the message
    grok {
      match => ["message", "%{TIMESTAMP_ISO8601:timestamp}\|\|%{NUMBER:syslog_cat}\|\|%{NUMBER:syslog_severity}\|\|%{IP:client}\|\|%%{CISCOTAG:ciscotag}: %{GREEDYDATA:cisco_message}"]
    }

    # Parse the syslog severity and facility
    syslog_pri { }

    # Parse the date from the "timestamp" field to the "@timestamp" field
    date {
      match => ["timestamp",
        "yyyy-MM-dd HH:mm:ss.SSS"
      ]
      timezone => "UTC"
    }

Your timestamps do not have milliseconds. Try "yyyy-MM-dd HH:mm:ss"
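
In other words, a minimal sketch of the corrected date filter, keeping the rest of your settings:

date {
  # matches timestamps like "2018-07-30 09:50:30" (no milliseconds)
  match => ["timestamp", "yyyy-MM-dd HH:mm:ss"]
  timezone => "UTC"
}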

Thank you very much for your help. Now it is working perfectly. Thank you Sir.
