Cisco ASA syslog parsing

I am new to ELK and I am trying to parse Cisco ASA syslogs. I am using the default Cisco firewall patterns from this plugin.

I can see properly parsed output in Logstash:

{
    "src_interface" => "outside",
         "src_port" => "47148",
       "@timestamp" => 2019-03-29T04:00:34.368Z,
         "protocol" => "UDP",
             "type" => "syslog",
         "@version" => "1",
             "host" => "60.1.1.1",
           "reason" => "Failed to locate egress interface",
           "dst_ip" => "202.12.27.33",
         "dst_port" => "53",
          "message" => "<166>Mar 29 2019 15:00:33 ciscoasa : %ASA-6-110002: Failed to locate egress interface for UDP from outside:50.1.1.2/47148 to 202.12.27.33/53\n",
           "src_ip" => "50.1.1.2"
}

But I keep getting this error

[WARN ] 2019-03-29 15:01:13.757 [[main]>worker0] elasticsearch - Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-asalogs-2019.03.29", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x2a023fa3>], :response=>{"index"=>{"_index"=>"logstash-asalogs-2019.03.29", "_type"=>"doc", "_id"=>"Zhybx2kB4sJ2xTwf2iYM", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Rejecting mapping update to [logstash-asalogs-2019.03.29] as the final mapping would have more than 1 type: [cisco-fw, doc]"}}}}

I know the reason for this error: support for multiple mapping types per index was removed in 6.0, and I am running 6.6. But I am not sure how to solve this issue.

Here is my Logstash config:

input {
  udp {
    port => 1514
    type => syslog
  }
}

filter {
  date {
    match => ["timestamp",
      "MMM dd HH:mm:ss",
      "MMM d HH:mm:ss",
      "MMM dd yyyy HH:mm:ss",
      "MMM d yyyy HH:mm:ss"
    ]
    timezone => "America/New_York"
  }

  if "_grokparsefailure" not in [tags] {
    mutate {
      rename => ["cisco_message", "message"]
      remove_field => ["timestamp"]
    }
  }

  syslog_pri { }

  grok {
    match => [
      "message", "%{CISCOFW106001}",
      "message", "%{CISCOFW106006_106007_106010}",
      "message", "%{CISCOFW106014}",
      "message", "%{CISCOFW106015}",
      "message", "%{CISCOFW106021}",
      "message", "%{CISCOFW106023}",
      "message", "%{CISCOFW106100}",
      "message", "%{CISCOFW110002}",
      "message", "%{CISCOFW302010}",
      "message", "%{CISCOFW302013_302014_302015_302016}",
      "message", "%{CISCOFW302020_302021}",
      "message", "%{CISCOFW305011}",
      "message", "%{CISCOFW313001_313004_313008}",
      "message", "%{CISCOFW313005}",
      "message", "%{CISCOFW402117}",
      "message", "%{CISCOFW402119}",
      "message", "%{CISCOFW419001}",
      "message", "%{CISCOFW419002}",
      "message", "%{CISCOFW500004}",
      "message", "%{CISCOFW602303_602304}",
      "message", "%{CISCOFW710001_710002_710003_710005_710006}",
      "message", "%{CISCOFW713172}",
      "message", "%{CISCOFW733100}"
    ]
  }
}

output {
  stdout {
    codec => rubydebug
  }

  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-asalogs-%{+YYYY.MM.dd}"
  }
}

Appreciate any help on this issue.

The log output contains a small clue: you are trying to create a second mapping type in an index. The index already contains documents with the type cisco-fw, while the configuration above uses the default type doc, and Elasticsearch 6.x allows only one type per index. You can try adding document_type => "cisco-fw" to the elasticsearch output so that new documents reuse the existing type.

For more information, see Elasticsearch output plugin | Logstash Reference [6.6] | Elastic (please take the time to read, as this is important for future upgrades)
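
For reference, a minimal sketch of what that output section could look like (index name and type are taken from your config and the error message; adjust as needed):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-asalogs-%{+YYYY.MM.dd}"
    # Match the type that already exists in the index,
    # so the mapping update is no longer rejected.
    document_type => "cisco-fw"
  }
}

Alternatively, if you do not need the documents already indexed, you can delete today's index (or switch to a fresh index name) and let everything be written with the single default type going forward.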
