FortiGate Firewall

Hi All,
I am trying to parse FortiGate firewall syslog in Logstash and am still failing after spending a lot of time on it.

I need your expertise for a standard FortiGate syslog Logstash config.

Here is my current config. I'm getting the logs, but they all have a _grokparsefailure tag, and the whole event is just one long "message" field.

I would like to extract dstip, srcip, srcport, dstport, geoip, etc.
Thanks in advance.

input {
  udp {
    port => 5514
    type => "syslog"
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

Hi @lcguy,
Here is a working example of a FortiGate firewall config for Logstash; adjust it for your environment. If you have X-Pack, leave it as it is; otherwise comment out the relevant lines in the output. FortiGate log lines don't carry the usual syslog timestamp/hostname header, which is why your syslog grok fails; this config strips the priority field and parses the rest as key=value pairs instead. I hope this helps.

input {
  udp {
    port => 7000
    type => "forti_log"
    tags => ["location_a"]
  }
}

filter {
  # The FortiGate syslog contains a "type" field as well; we'll need to rename that field for this to work.
  if [type] == "forti_log" {

    grok {
      match => ["message", "%{SYSLOG5424PRI:syslog_index}%{GREEDYDATA:message}"]
      overwrite => [ "message" ]
      tag_on_failure => [ "forti_grok_failure" ]
    }

    kv {
      source => "message"
      value_split => "="
      # Expects CSV logging to be enabled on your FortiGate. If not, I think you'll have to change this to " ", but I didn't test that.
      field_split => ","
    }

    mutate {
      # I want to use the timestamp inside the logs instead of Logstash's, so first build a field from the log's date and time fields; the date filter below converts it to @timestamp.
      add_field => { "temp_time" => "%{date} %{time}" }
      # The log contains a "type" field that clashes with the Logstash type field, so we have to rename it.
      rename => { "type" => "ftg_type" }
      rename => { "subtype" => "ftg_subtype" }
      add_field => { "type" => "forti_log" }
      convert => { "rcvdbyte" => "integer" }
      convert => { "sentbyte" => "integer" }
    }

    date {
      match => [ "temp_time", "yyyy-MM-dd HH:mm:ss" ]
      timezone => "UTC"
      target => "@timestamp"
    }

    mutate {
      # Add/remove fields as you see fit.
      remove_field => ["syslog_index","syslog5424_pri","path","temp_time","service","date","time","sentpkt","rcvdpkt","log_id","message","poluuid"]
    }
  }
}

output {
  stdout { codec => rubydebug }
  if [type] == "forti_log" {
    elasticsearch {
      hosts => "localhost:9200"
      http_compression => true
      index => "forti-%{+YYYY.MM.dd}"
      user => "elastic"
      password => "elastic"
      template => "/usr/share/logstash/bin/forti.json"
      template_name => "forti-*"
    }
  }
}

Thanks, Krunal.
I tried this Logstash config but it failed; only a few messages come out in Kibana.

A couple of questions, if I may:

  • I didn't install X-Pack. Do I need it? I just want to capture the FortiGate firewall logs; the FortiGate will send syslog to Logstash on CentOS via UDP 5514.
  • I also don't have the template file forti.json at "/usr/share/logstash/bin/forti.json". Where can I get this file?

Thanks.

You can just remove the things you don't need. Try this one; it works without X-Pack, and no template is used.

input {
  udp {
    port => 5514
    type => "forti_log"
  }
}

filter {
  if [type] == "forti_log" {

    grok {
      match => ["message", "%{SYSLOG5424PRI:syslog_index}%{GREEDYDATA:message}"]
      overwrite => [ "message" ]
      tag_on_failure => [ "forti_grok_failure" ]
    }

    kv {
      source => "message"
      value_split => "="
      field_split => ","
    }

    mutate {
      add_field => { "temp_time" => "%{date} %{time}" }
      rename => { "type" => "ftg_type" }
      rename => { "subtype" => "ftg_subtype" }
      add_field => { "type" => "forti_log" }
      convert => { "rcvdbyte" => "integer" }
      convert => { "sentbyte" => "integer" }
    }

    date {
      match => [ "temp_time", "yyyy-MM-dd HH:mm:ss" ]
      timezone => "UTC"
      target => "@timestamp"
    }

    mutate {
      # Add/remove fields as you see fit.
      remove_field => ["syslog_index","syslog5424_pri","path","temp_time","service","date","time","sentpkt","rcvdpkt","log_id","message","poluuid"]
    }
  }
}

output {
  stdout { codec => rubydebug }
  if [type] == "forti_log" {
    elasticsearch {
      hosts => "localhost:9200"
      http_compression => true
      index => "forti-%{+YYYY.MM.dd}"
    }
  }
}

Note:
On the FortiGate firewall you have to enable CSV forwarding; if it's not enabled, do that first and then run this config.
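If it helps, CSV output is enabled from the FortiGate CLI, roughly like this. Treat it as a sketch: command names vary by FortiOS version (on older releases the format line is "set csv enable" instead of "set format csv"), and the server IP below is just an example value for your Logstash host.

config log syslogd setting
    set status enable
    set server "192.0.2.10"
    set port 5514
    set format csv
end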

Otherwise, you can use the following config:

input {
  udp {
    port => 5514
    type => "syslog"
  }
}

filter {
  mutate {
    gsub => [
      "message", ": ", ":",
      "message", "^<[0-9][0-9][0-9]>", ""
    ]
  }
  kv {
    # Non-CSV FortiGate logs are space-separated key=value pairs; use "," instead if CSV logging is enabled.
    field_split => " "
    source => "message"
    add_field => ["sourcetime", "%{date}:%{time}"]
  }
  date { match => ["sourcetime","yyyy-MM-dd:HH:mm:ss"] }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => false
    index => "fortigate1-%{+YYYY.MM.dd}"
  }
}

Thanks & Regards,
Krunal.


Thanks, Krunal. It seems like something is wrong with the config, maybe the syntax or some whitespace somewhere?
Logstash can't load it.

Mine has no CSV export enabled, only plain syslog.

Here is my current config. I'm getting the logs, but they all have a _grokparsefailure tag, and the whole event is just one long "message" field.

No need to reinvent the wheel, use the SYSLOGLINE pattern from
https://github.com/logstash-plugins/logstash-patterns-core/blob/master/patterns/linux-syslog

I would like to extract dstip, srcip, srcport, dstport, geoip, etc.

I'd use the kv filter with include_keys.
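Something along these lines, as an untested sketch: it assumes your device actually prefixes a standard syslog header (which SYSLOGLINE expects), and that the key=value payload ends up in the message field afterwards.

filter {
  # SYSLOGLINE splits a standard syslog line into timestamp, logsource, program, and message.
  grok {
    match => { "message" => "%{SYSLOGLINE}" }
    overwrite => [ "message" ]
  }
  # Keep only the key=value pairs you care about.
  kv {
    source => "message"
    include_keys => ["srcip", "dstip", "srcport", "dstport"]
  }
}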

Wow... Simmel.
I am very new to programming and have no idea where to put these lines.
Can you give me some simple config that I can play with later?

Thanks.

If you are collecting syslog without CSV, the last conf file works fine for me; it also works fine for me with CSV.

Do one thing: create a normal conf file with only input and output first, and then add this filter at the end.

filter {
  mutate {
    gsub => [
      "message", ": ", ":",
      "message", "^<[0-9][0-9][0-9]>", ""
    ]
  }
  kv {
    # Non-CSV FortiGate logs are space-separated key=value pairs; use "," instead if CSV logging is enabled.
    field_split => " "
    source => "message"
    add_field => ["sourcetime", "%{date}:%{time}"]
  }
  date { match => ["sourcetime","yyyy-MM-dd:HH:mm:ss"] }
}

First create a normal conf file with only input and output, run it, and see what format the logs arrive in; then add the filter above. It may work for you. A bare input/output config would look something like the sketch below.
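For example, something as minimal as this (a sketch, reusing the UDP port from earlier in the thread):

input {
  udp {
    port => 5514
    type => "syslog"
  }
}

output {
  # Print every event to the console so you can inspect the raw message format.
  stdout { codec => rubydebug }
}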

Thanks & Regards,
Krunal.


It works with the following config now. There seems to be an error in the kv block; after I commented out kv, everything works.
Do I need to install something for the kv filter to work?

input {
  udp {
    port => 5514
    type => "syslog"
  }
}

filter {
  mutate {
    gsub => [
      "message", ": ", ":",
      "message", "^<[0-9][0-9][0-9]>", ""
    ]
  }

  #kv {
  #  field_split => " "
  #  source => "message"
  #  add_field => ["sourcetime", "%{date}:%{time}"]
  #}

  #date { match => ["sourcetime","yyyy-MM-dd:HH:mm:ss"] }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

Ohhh, great!!

Now, what is your output? Are you receiving all the information you want,

or is something still missing?

Thanks & Regards,
Krunal.

Yes, it works now for the FortiGate firewall.
I have tweaked it a bit; I copied some parts from other people's posts.

Thanks a lot, Krunal.

input {
  udp {
    port => 5514
    type => "syslog"
  }
}

filter {
  mutate {
    gsub => [
      "message", ": ", ":",
      "message", "^<[0-9][0-9][0-9]>", ""
    ]
  }

  # kv defaults (split fields on whitespace, values on "=") fit the plain FortiGate key=value format.
  kv { }

  # FortiGate puts its human-readable text in the msg field; promote it to message.
  if [msg] {
    mutate {
      replace => [ "message", "%{msg}" ]
    }
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
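The geoip part from my original question is still on my to-do list. Something like this is what I plan to try (an untested sketch; it assumes kv produced a srcip field, and a lookup is only meaningful for public addresses):

filter {
  # Only attempt a lookup when the event actually has a source IP.
  if [srcip] {
    geoip {
      source => "srcip"
      target => "src_geoip"
    }
  }
}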

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.