Using Fortinet Integration in Logstash

Hey everyone o/

I'm using the conf I found in this topic for parsing logs from a Fortigate FW with Logstash:

    input {
      udp {
        port => 514
        type => "forti_log"
        tags => ["FortiGateFW"]
      }
    }
    #THE BEGINNING DOESN'T FORMAT TO CODE :thinking: AND I DON'T KNOW WHY

    filter {
      # The Fortigate syslog contains a type field as well; we'll need to rename that field for this to work
      if [type] == "forti_log" {

        grok {
          match => ["message", "%{SYSLOG5424PRI:syslog_index}%{GREEDYDATA:message}"]
          overwrite => [ "message" ]
          tag_on_failure => [ "forti_grok_failure" ]
        }

        kv {
          source => "message"
          value_split => "="
          # Expects you have csv enabled on your Fortigate. If not, I think you'll have to change it to " ", but I didn't test that.
          field_split => ","
        }

        mutate {
          # I want to use the timestamp inside the logs instead of Logstash's timestamp, so we first create a new field containing the date and time fields from the syslog before converting that to the @timestamp field
          add_field => { "temp_time" => "%{date} %{time}" }
          # The syslog contains a type field which clashes with the Logstash type field, so we have to rename it
          rename => { "type" => "ftg_type" }
          rename => { "subtype" => "ftg_subtype" }
          add_field => { "type" => "forti_log" }
          convert => { "rcvdbyte" => "integer" }
          convert => { "sentbyte" => "integer" }
        }

        date {
          match => [ "temp_time", "yyyy-MM-dd HH:mm:ss" ]
          timezone => "UTC"
          target => "@timestamp"
        }

        mutate {
          # Add/remove fields as you see fit
          remove_field => ["syslog_index","syslog5424_pri","path","temp_time","service","date","time","sentpkt","rcvdpkt","log_id","message","poluuid"]
        }
      }
    }

    output {
      stdout { codec => rubydebug }
      if [type] == "forti_log" {
        elasticsearch {
          hosts => ["XXX.XXX.XXX.XXX:9200", "XXX.XXX.XXX.XXX:9200", "XXX.XXX.XXX.XXX:9200"]
          http_compression => "true"
          index => "forti-%{+YYYY.MM.dd}"
          #user => "elastic"
          #password => "elastic"
          #template => "/usr/share/logstash/bin/forti.json"
          #template_name => "forti-*"
        }
      }
    }
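For anyone trying to follow what the grok + kv stages actually do, here is a minimal Python sketch (the sample log line is hypothetical, with illustrative values only): strip the leading syslog priority the way `%{SYSLOG5424PRI}` does, then split the remaining comma-separated key=value pairs like the kv filter with `field_split => ","` and `value_split => "="`.

```python
import re

# Hypothetical FortiGate syslog line (illustrative values, not from a real device)
raw = '<189>date=2021-01-15,time=10:20:30,type=traffic,subtype=forward,rcvdbyte=1024,sentbyte=2048'

# grok: %{SYSLOG5424PRI} pulls off the leading <pri>, leaving the message body
m = re.match(r'<(?P<pri>\d+)>(?P<message>.*)', raw)
message = m.group('message')

# kv: split on "," then on the first "=" (assumes csv-style logging is enabled on the firewall)
event = dict(pair.split('=', 1) for pair in message.split(','))

print(event['type'])      # traffic
print(event['rcvdbyte'])  # 1024

# mutate: combine date and time, as the config's temp_time field does
temp_time = f"{event['date']} {event['time']}"
print(temp_time)          # 2021-01-15 10:20:30
```

Note that `split('=', 1)` only splits on the first `=`, which is why the time value `10:20:30` survives intact.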

And at this moment I think all the parsing is OK (at least with one Fortigate FW; when I add a second one to the same conf, some fields parse with errors, but that's for another topic, I think some config on the 2nd FW's logs is different).

The point is:
While learning more about ELK I found the yourkibanahost /app/fleet#/integrations/detail/fortinet-0.8.0/overview integration, and I don't know how to use it with Logstash. I would like to test it to compare the results.
*I think every Kibana UI shows this integration, but you can read more about it here

Do I need to convert my config like this doc explains?
Or start from this doc?

I have a lot to learn :roll_eyes: and sometimes I get confused about what to focus on... The first mission on my log server is to get powerful insights from Fortigate logs/dashboards, so I'm trying to do my best on this :man_technologist:

Please edit your post, select the Logstash configuration (only the Logstash configuration) and click on </> in the toolbar above the edit pane. That will change the appearance from

udp {
port => 514
type => "forti_log"
tags => ["FortiGateFW"]
}

which I find unreadable to

udp {
  port => 514
  type => "forti_log"
  tags => ["FortiGateFW"]
}

Also, the link you posted is to the local hostname mykibana, so nobody else can follow it.


The integration you linked to is for Filebeat, not Logstash. You can have Filebeat listen on a network port and tell your Fortinet device to send syslog messages to it. Filebeat can send the results to Elasticsearch, so you may not need to use Logstash at all. I suggest you set that up first. If you find you need additional processing, then you may be able to do it with Logstash.
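Roughly, that setup looks like the sketch below. This assumes the Filebeat Fortinet module; the listen address and port are placeholders you'd adjust for your environment, and variable names can differ between Filebeat versions, so check the module docs for your release.

```yaml
# Enable the module first:  filebeat modules enable fortinet
# Then edit modules.d/fortinet.yml:
- module: fortinet
  firewall:
    enabled: true
    var.input: udp
    var.syslog_host: 0.0.0.0   # listen address (assumed)
    var.syslog_port: 9004      # pick a port and point the FortiGate's syslog output at it

# And in filebeat.yml, send straight to Elasticsearch:
# output.elasticsearch:
#   hosts: ["XXX.XXX.XXX.XXX:9200"]
```

Running `filebeat setup` afterwards loads the integration's index templates and dashboards.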


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.