Filebeat: Fortinet Module not processing data as it should

It's still not working properly :confused:
The best result I have so far is parsing directly with Logstash, using the following config (taken from this topic):

input {
	udp {
		port => 55514
		type => "forti_log"
		tags => ["FortiGateFW"]
	}
}

filter {
	#The FortiGate syslog also contains a type field; we'll need to rename it for this to work (see the mutate below)
	if [type] == "forti_log" {

		grok {
			match => ["message", "%{SYSLOG5424PRI:syslog_index}%{GREEDYDATA:message}"]
			overwrite => [ "message" ]
			tag_on_failure => [ "forti_grok_failure" ]
		}

		kv {
			source => "message"
			value_split => "="
			#Expects CSV logging to be enabled on your FortiGate. If not, I think you'll have to change this to " ", but I didn't test that.
			field_split => ","
		}

		mutate {
			#Use the timestamp from the log itself instead of Logstash's: combine the syslog date and time fields into a temporary field, then convert it to @timestamp below
			add_field => { "temp_time" => "%{date} %{time}" }
			#The syslog contains a type field which messes with the Logstash type field so we have to rename it.
			rename => { "type" => "ftg_type" }
			rename => { "subtype" => "ftg_subtype" }
			add_field => { "type" => "forti_log" }
			convert => { "rcvdbyte" => "integer" }
			convert => { "sentbyte" => "integer" }
		}

		date {
			match => [ "temp_time", "yyyy-MM-dd HH:mm:ss" ]
			timezone => "UTC"
			target => "@timestamp"
		}

		mutate {
			#add/remove fields as you see fit.
			remove_field => ["syslog_index","syslog5424_pri","path","temp_time","service","date","time","sentpkt","rcvdpkt","log_id","message","poluuid"]
		}
	}
}

output {
	stdout { codec => rubydebug }
	if [type] == "forti_log" {
		elasticsearch {
			hosts => ["https://xxx"]
			http_compression => true
			index => "forti-%{+YYYY.MM.dd}"
			user => "logstash_writer"
			password => "xxx"
			cacert => "/etc/logstash/certs/ca.crt"
		}
	}
}

But when a field value contains a comma in the middle, the kv filter splits it where it shouldn't, as you can see in the image below:
(screenshot of the Logstash output)

Because of that, my dashboards don't have 100% accurate information. But it's the best I have achieved so far...
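For illustration, a CSV-mode FortiGate line looks roughly like this (hypothetical values, not copied from my logs):

	date=2024-01-15,time=10:30:00,devname="FGT60E",type=utm,msg="Blocked, policy violation"

With field_split => ",", the kv filter also splits on the comma inside the quoted msg value, so msg ends up as "Blocked and the rest of the text becomes a stray token instead of staying part of msg.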

Can you tell me the difference between using the Filebeat + Fortinet module and using the Elastic Agent with the Fortinet integration?

I tried to use the Elastic Agent, but it gave some errors and I ended up giving up. From what I read, it's mostly a way to centralize Beats management, so I don't think it would make much difference compared to Filebeat.

If there is some way to solve this problem (fields being split on commas where they shouldn't be), I will keep parsing directly with Logstash...
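One way to keep commas inside quoted values intact (a sketch, untested against your exact logs; field_split_pattern needs a reasonably recent version of the kv filter plugin) is to split only on commas that sit outside double quotes:

	kv {
		source => "message"
		value_split => "="
		#Split only on commas outside double quotes: the lookahead requires an even
		#number of remaining quote characters, so msg="a, b" stays a single field
		field_split_pattern => ',(?=(?:[^"]*"[^"]*")*[^"]*$)'
		#Strip the surrounding quotes from the parsed values
		trim_value => '"'
	}

Alternatively, if you disable CSV mode on the FortiGate, the logs become space-separated key=value pairs with quoted values, which the kv filter's default quoted-value handling should parse without any custom split pattern.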