Hey Guys,
I am a noob when it comes to ELK, but I am really eager to get this set up. I am currently using ELK to store syslog from multiple firewalls. One of them is a Fortinet FortiGate (which is turning out to be not much fun to work with). I am sending all of the syslog from the FortiGate to port 514 and attempting to have Logstash parse it (there is a sample of what arrives just after the list below). Here is what I know so far:
- I am able to receive syslog on the Ubuntu instance running on the server.
- Kibana is successfully receiving logs from Beats, and Logstash is parsing them with a filter I set up (I followed a tutorial video on YouTube).
- I cobbled together the franken-code below. Please let me know where you think I should look next; I have been playing around with this configuration file for way too long... It is all in one file (which may be the problem?).
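For reference, the raw lines coming off the FortiGate are basically a syslog priority followed by key=value pairs, which is what the kv filter below is meant to split up. This line is only an illustration (the values are made up and the exact fields vary by FortiOS version and log type):

<189>date=2019-05-15 time=10:03:22 devname=FGT60D devid=FGT60D0000000000 logid=0000000013 type=traffic subtype=forward level=notice srcip=10.1.1.10 dstip=8.8.8.8 action=accept msg="test event"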
Logstash conf file:
input {
  # local syslog file on the Ubuntu box
  file {
    path => "/var/log/syslog"
    type => "syslog"
    start_position => "beginning"
  }
  # FortiGate syslog coming in over the network
  udp {
    port => 514
    type => "fortigate"
  }
  tcp {
    port => 514
    type => "fortigate"
  }
}
# Configure syslog filtering for the FortiGate firewall logs
filter {
  if [type] == "fortigate" {
    mutate {
      add_tag => ["fortigate"]
    }
    # split off the syslog priority, keep the rest as the message
    grok {
      match => ["message", "%{SYSLOG5424PRI:syslog_index}%{GREEDYDATA:message}"]
      overwrite => [ "message" ]
      tag_on_failure => [ "failure_grok_fortigate" ]
    }
    # FortiGate logs are key=value pairs, so kv splits them into fields
    kv { }
    if [msg] {
      mutate {
        replace => [ "message", "%{msg}" ]
      }
    }
    mutate {
      add_field => ["logTimestamp", "%{date} %{time}"]
      add_field => ["loglevel", "%{level}"]
      replace => [ "fortigate_type", "%{type}"]
      replace => [ "fortigate_subtype", "%{subtype}"]
      remove_field => [ "msg", "type", "level", "date", "time" ]
    }
    date {
      locale => "en"
      match => ["logTimestamp", "YYYY-MM-dd HH:mm:ss"]
      remove_field => ["logTimestamp", "year", "month", "day", "time", "date"]
      add_field => ["type", "fortigate"]
    }
  } # end if type == "fortigate"
}
output {
  if [type] == "fortigate" {
    # uncomment to see parsed events when running Logstash in the foreground
    #stdout { codec => rubydebug }
    elasticsearch {
      index => "logstash_fortigate-%{+YYYY.MM.dd}"
      hosts => ["http://localhost:9200"]
    }
  }
}
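A rough way to sanity-check this, assuming the default .deb package layout (adjust the paths if yours differ) and that netcat is installed:

# check that the config at least parses
sudo /usr/share/logstash/bin/logstash --path.settings /etc/logstash --config.test_and_exit

# with the stdout { codec => rubydebug } line uncommented and Logstash running
# in the foreground against this file, push a fake FortiGate-style line into
# the udp input and watch for the parsed event
echo '<189>date=2019-05-15 time=10:03:22 devname=FGT60D type=traffic subtype=forward level=notice msg="test"' | nc -u -w1 localhost 514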