Hello,
I want to export logs from my Cisco router via Logstash.
Here is the filter I put in place, which parses the logs correctly:
The filter:
###########################
filter {
  grok {
    patterns_dir => [ "/etc/logstash/patterns" ]
    match => [
      "message",
      "%{HOSTNAME:host.name}: %{CISCOTIMESTAMPTZ:log_date}: %%{CISCO_REASON:facility}-%{INT:severity_level}-%{CISCO_REASON:facility_mnemonic}: %{GREEDYDATA:message}"
    ]
    overwrite => [ "message" ]
    add_tag => [ "cisco" ]
    remove_field => [ "@version" ]
  }
  if "cisco" in [tags] {
    mutate {
      gsub => [
        "severity_level", "0", "0 - Emergency",
        "severity_level", "1", "1 - Alert",
        "severity_level", "2", "2 - Critical",
        "severity_level", "3", "3 - Error",
        "severity_level", "4", "4 - Warning",
        "severity_level", "5", "5 - Notification",
        "severity_level", "6", "6 - Information"
      ]
    }
  }
}
###########################
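For reference, a raw line that this grok pattern matches looks like this (reconstructed from the pattern and the output fields below; the exact syslog framing from the router may differ):

```
RBB: Sep 30 18:02:29.452: %SSH-3-BAD_PACK_LEN: Bad packet length -440567870
```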
The output:
###########################
{
"facility_mnemonic" => "BAD_PACK_LEN",
"facility" => "SSH",
"message" => "Bad packet length -440567870",
"severity_level" => "3 - Error",
"host.name" => "RBB",
"log_date" => "Sep 30 18:02:29.452",
"@timestamp" => 2019-10-02T09:26:40.930Z,
"host" => "LOGSTASH",
"tags" => [
[0] "cisco"
],
"type" => "syslog-cisco"
}
###########################
I get the following error when I export the logs to Elasticsearch:
[WARN ] 2019-10-02 12:06:52.184 [[main]>worker3] elasticsearch - Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"network-2019.10.02", :_type=>"syslog-cisco", :routing=>nil}, #<LogStash::Event:0x137742b0>], :response=>{"index"=>{"_index"=>"network-2019.10.02", "_type"=>"syslog-cisco", "_id"=>"c-7vi20BviaHAKaazhfz", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"mapper [host] of different type, current_type [keyword], merged_type [ObjectMapper]"}}}}
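From what I have read, the conflict may come from the grok capture `%{HOSTNAME:host.name}`: a field named `host.name` is mapped by Elasticsearch as an object `host` with a sub-field `name`, while the event already carries a plain `host` field (`"host" => "LOGSTASH"` in the output above) mapped as `keyword`, hence "current_type [keyword], merged_type [ObjectMapper]". If that is the cause, one workaround I am considering is to capture into a name that cannot collide with `host` (`device_name` is just a placeholder I picked, untested):

```
grok {
  # "device_name" is a placeholder of my own; the point is to stop
  # writing "host.name" while "host" is already mapped as keyword
  match => [
    "message",
    "%{HOSTNAME:device_name}: %{CISCOTIMESTAMPTZ:log_date}: %%{CISCO_REASON:facility}-%{INT:severity_level}-%{CISCO_REASON:facility_mnemonic}: %{GREEDYDATA:message}"
  ]
}
```

Another option might be to first rename the existing `host` string out of the way with a mutate filter, but I am not sure which approach is recommended.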
Could you help me?