I am new to Elastic, so any help would be appreciated. I have ELK working and a basic Winlogbeat feed coming in from a DC. Now I am trying to get FortiGate traffic into Elasticsearch.
I am running FortiOS v6.2.1 build0932 (GA)
My syslog settings on the FortiGate look like this:
FGT60D461xxxxxxxx (setting) # get
status : enable
server : 192.168.10.xxx
mode : udp
port : 514
facility : local0
source-ip : 192.168.10.xxx
format : default
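For reference, the CLI that produces those settings looks like this (command names per FortiOS 6.2; the server and source IPs are masked the same way as in the `get` output above):

```
config log syslogd setting
    set status enable
    set server "192.168.10.xxx"
    set mode udp
    set port 514
    set facility local0
    set source-ip 192.168.10.xxx
    set format default
end
```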
My Logstash config file is at:
/etc/logstash/conf.d/Forti60E.conf
input {
  udp {
    port => 514    # change this if you defined a different port on the FortiGate
    type => "forti_log"
  }
}
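One thing I was unsure about: on most Linux distros the logstash service runs as a non-root user, and ports below 1024 are privileged, so Logstash may not be able to bind UDP 514 at all. A quick sketch for checking whether anything is actually listening (adjust the port if you changed it):

```shell
# Is anything bound to udp/514? Falls back to a message if ss finds nothing
ss -lun 2>/dev/null | grep ':514' || echo "nothing listening on udp/514"
```

If nothing is listening, a common workaround is to move both the FortiGate syslog setting and the Logstash input to a port above 1024 (e.g. 5514).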
filter {
  if [type] == "forti_log" {
    grok {
      match => ["message", "%{SYSLOG5424PRI:syslog_index}%{GREEDYDATA:message}"]
      overwrite => [ "message" ]
      tag_on_failure => [ "forti_grok_failure" ]
    }
    kv {
      source => "message"
      value_split => "="
      field_split => ","
    }
    mutate {
      add_field => { "temp_time" => "%{date} %{time}" }
      rename => { "type" => "ftg_type" }
      rename => { "subtype" => "ftg_subtype" }
      add_field => { "type" => "forti_log" }
      convert => { "rcvdbyte" => "integer" }
      convert => { "sentbyte" => "integer" }
    }
    date {
      match => [ "temp_time", "yyyy-MM-dd HH:mm:ss" ]
      timezone => "UTC"    # change to your timezone
      target => "@timestamp"
    }
    mutate {
      remove_field => [ "syslog_index", "syslog5424_pri", "path", "temp_time", "service", "date", "time", "sentpkt", "rcvdpkt", "log_id", "message", "poluuid" ]
    }
  }
}
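One thing worth double-checking in the kv step: FortiOS `format default` messages typically separate their key=value pairs with spaces rather than commas, so `field_split => ","` may leave the whole payload as a single field. A small sketch of what the kv split above actually does (the sample line is made up, with comma separators matching the filter as written):

```shell
# Hypothetical FortiOS-style payload; check a raw message to see whether
# your pairs are really comma-separated or space-separated
line='date=2019-05-13,time=11:45:03,devname="FGT60D",subtype="forward",sentbyte=1699,rcvdbyte=5315'
# Emulate kv with value_split "=" and field_split ",": one pair per line, quotes stripped
echo "$line" | tr ',' '\n' | sed 's/"//g' | awk -F'=' '{ print $1 " -> " $2 }'
```

If the raw messages turn out to be space-separated, changing `field_split` to `" "` in the kv filter would be the fix to try.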
output {
  stdout { codec => rubydebug }
  if [type] == "forti_log" {
    elasticsearch {
      hosts => "192.168.10.212:9200"    # change to your Elasticsearch IP
      http_compression => "true"
      index => "forti-%{+YYYY.MM.dd}"
      user => "elastic"
      password => "elastic"
      template => "/usr/share/logstash/bin/forti.json"
      template_name => "forti-*"
    }
  }
}
Then I have this file here:
/usr/share/logstash/bin/forti.json
{
  "template" : "forti-*",
  "version" : 50001,
  "settings" : {
    "index.refresh_interval" : "5s"
  },
  "mappings" : {
    "_default_" : {
      "_all" : { "enabled" : true, "omit_norms" : false },
      "dynamic_templates" : [ {
        "message_field" : {
          "path_match" : "message",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "text",
            "omit_norms" : false
          }
        }
      }, {
        "string_fields" : {
          "match" : "*",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "text", "omit_norms" : false,
            "fields" : {
              "keyword" : { "type" : "keyword", "ignore_above" : 256 }
            }
          }
        }
      } ],
      "properties" : {
        "@timestamp" : { "type" : "date", "include_in_all" : false },
        "@version" : { "type" : "keyword", "include_in_all" : false },
        "geoip" : {
          "dynamic" : true,
          "properties" : {
            "ip" : { "type" : "ip" },
            "location" : { "type" : "geo_point" },
            "latitude" : { "type" : "half_float" },
            "longitude" : { "type" : "half_float" }
          }
        },
        "location" : { "type" : "geo_point" }
      }
    }
  }
}
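Once Logstash starts cleanly, I assume one way to confirm the template was actually installed is to ask Elasticsearch for it via the legacy template endpoint (credentials and IP as in the output section above):

```shell
# Should return the template body if Logstash installed it; an empty {} means it is missing
curl -s -u elastic:elastic 'http://192.168.10.212:9200/_template/forti-*?pretty'
```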
Then ran:
sudo systemctl stop logstash.service
sudo systemctl start logstash.service
Nothing is showing up in Kibana. Where should I be looking for errors/logs?
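So far I assume the standard places on a package install are the systemd journal and Logstash's own log file, e.g.:

```shell
# Startup/config errors usually land in the systemd journal...
sudo journalctl -u logstash --no-pager -n 50
# ...and in Logstash's own log (default path for deb/rpm installs)
sudo tail -n 50 /var/log/logstash/logstash-plain.log
# The pipeline can also be syntax-checked without starting the service
sudo -u logstash /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/Forti60E.conf --config.test_and_exit
```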