On the remote client:
filter {
  if [type] == "fortilog" {
    mutate {
      gsub => [
        "message", "[\=]", ":"
      ]
    }
    grok {
      match => ["message", "%{SYSLOG5424PRI:syslog_index}%{GREEDYDATA:message}"]
      overwrite => [ "message" ]
      tag_on_failure => [ "failure_grok_fortigate" ]
    }
    mutate {
      add_field => { "location_field" => "Location_A" }
      gsub => [
        "date", "[\"]", ""
      ]
    }
  }
}
The second gsub doesn't appear to do anything, presumably because no date field exists on the client at that point: grok only produces syslog_index and message, so date is still just text inside the message field.
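If that is the cause, the field would have to be parsed out before a field-level gsub can touch it. A minimal sketch of what I mean, assuming a kv filter on the client is acceptable (untested):

kv {
  source      => "message"
  value_split => ":"
  field_split => ","
}
mutate {
  # "date" only exists as an event field after the kv parse above
  gsub => [ "date", "[\"]", "" ]
}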
The above configuration results in the following output in the .txt file on the remote client:
{"@timestamp":"2017-11-10T00:46:12.687Z","syslog_index":"<188>","syslog5424_pri":"188","@version":"1","host":"10.0.0.111","message":"date:2017-11-10,
time:00:46:09,type:traffic,subtype:other,msg:\"iprope_in_check() check failed, drop\"","type":"fortilog","location_field":"Location_A"}
The above output causes several problems when I try to ingest the file with the server-side Logstash:
- @timestamp becomes {@timestamp and Location_A becomes Location_A}; the braces that wrap the JSON event end up attached to the first and last keys.
- In the message field, date is shown as "date, which causes issues later on.
- The Fortigate logs include a type field, and since Logstash adds a type field of its own, the two collide. I tried renaming the type field to log_type when the value is not fortilog (the Logstash type value), but I can't get it to work; a sketch of what I am attempting follows this list.
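One variant I am experimenting with does the rename on the remote client instead, inside the existing fortilog conditional, so the Logstash-assigned type is moved out of the way before the event is written and only the Fortigate type key remains for the server to parse. A minimal sketch (untested):

# on the remote client, inside the existing [type] == "fortilog" block
mutate {
  rename => { "type" => "log_type" }
}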
Filter on the Logstash server:
filter {
  mutate {
    gsub => [
      "message", "[\\"\}]", ""
    ]
  }
  kv {
    value_split => ":"
    field_split => ","
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
This results in:
"msg" => "iprope_in_check() check failed",
"message" => "date:2017-11-10",
"type" => [
[0] "traffic",
[1] "fortilog"
],
"path" => "/home/test/Desktop/test/test1.txt",
"@timestamp" => 2017-11-10T00:57:01.483Z,
"syslog_index" => "188",
"subtype" => "other",
"{@timestamp" => "2017-11-10T00:46:12.687Z",
"syslog5424_pri" => "188",
"@version" => "1",
"host" => "10.0.0.111",
"time" => "00:46:09",
"location_field" => "Location_A"
The "date is probably breaking the date field and the { I cannot remove by adding { to the gsub filter.