Fortinet module not parsing from file


I want to parse Fortinet logs from a file instead of receiving them over the syslog network input. The options var.paths and var.input are documented in Fortinet module | Filebeat Reference [master] | Elastic. I configured them as below:

- module: fortinet
  firewall:
    enabled: true
    var.paths:
      - /home/vagrant/shared/fortigate.log
    # Set which input to use between tcp, udp (default) or file.
    var.input: file

I am appending log lines to the fortigate.log file with a script, but Filebeat is not parsing the message part.
This is a sample raw log line I append to fortigate.log:

date=2017-11-15 time=11:44:16 logid="0000000013" type="traffic" subtype="forward" level="notice" vd="vdom1" eventtime=1510775056 srcip= srcname="pc1" srcport=40772 srcintf="port12" srcintfrole="undefined" dstip= dstname="" dstport=443 dstintf="port11" dstintfrole="undefined" poluuid="707a0d88-c972-51e7-bbc7-4d421660557b" sessionid=8058 proto=6 action="close" policyid=1 policytype="policy" policymode="learn" service="HTTPS" dstcountry="United States" srccountry="Reserved" trandisp="snat" transip= transport=40772 appid=40568 app="HTTPS.BROWSER" appcat="Web.Client" apprisk="medium" duration=2 sentbyte=1850 rcvdbyte=39898 sentpkt=25 rcvdpkt=37 utmaction="allow" countapp=1 devtype="Linux PC" osname="Linux" mastersrcmac="a2:e9:00:ec:40:01" srcmac="a2:e9:00:ec:40:01" srcserver=0 utmref=0-220586
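For reference, a FortiGate log line like the one above is a sequence of space-separated key=value pairs, where values are either bare tokens or double-quoted strings (which may contain spaces) and may be empty, as with srcip= here. The sketch below is only an illustration of how such a line decomposes into fields; it is not the Fortinet module's actual ingest-pipeline logic:

```python
import re

# Key=value pairs: the value is either a double-quoted string
# (may contain spaces) or a bare run of non-space characters
# (possibly empty, as in 'srcip=').
KV_RE = re.compile(r'(\w+)=("([^"]*)"|\S*)')

def parse_fortigate(line: str) -> dict:
    """Split one FortiGate log line into a dict of field -> string value."""
    fields = {}
    for key, raw, quoted in KV_RE.findall(line):
        # If the raw value starts with a quote, use the unquoted capture.
        fields[key] = quoted if raw.startswith('"') else raw
    return fields

sample = ('date=2017-11-15 time=11:44:16 logid="0000000013" type="traffic" '
          'subtype="forward" level="notice" srcip= srcport=40772 '
          'dstcountry="United States" action="close"')
parsed = parse_fortigate(sample)
print(parsed["type"], parsed["dstcountry"], repr(parsed["srcip"]))
```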

This is the output of Filebeat:

{
  "@timestamp": "2021-05-20T21:53:12.537Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "_doc",
    "version": "7.12.1",
    "pipeline": "filebeat-7.12.1-fortinet-firewall-pipeline"
  },
  "message": "date=2017-11-15 time=11:44:16 logid=\"0000000013\" type=\"traffic\" subtype=\"forward\" level=\"notice\" vd=\"vdom1\" eventtime=1510775056 srcip= srcname=\"pc1\" srcport=40772 srcintf=\"port12\" srcintfrole=\"undefined\" dstip= dstname=\"\" dstport=443 dstintf=\"port11\" dstintfrole=\"undefined\" poluuid=\"707a0d88-c972-51e7-bbc7-4d421660557b\" sessionid=8058 proto=6 action=\"close\" policyid=1 policytype=\"policy\" policymode=\"learn\" service=\"HTTPS\" dstcountry=\"United States\" srccountry=\"Reserved\" trandisp=\"snat\" transip= transport=40772 appid=40568 app=\"HTTPS.BROWSER\" appcat=\"Web.Client\" apprisk=\"medium\" duration=2 sentbyte=1850 rcvdbyte=39898 sentpkt=25 rcvdpkt=37 utmaction=\"allow\" countapp=1 devtype=\"Linux PC\" osname=\"Linux\" mastersrcmac=\"a2:e9:00:ec:40:01\" srcmac=\"a2:e9:00:ec:40:01\" srcserver=0 utmref=0-220586",
  "input": {
    "type": "log"
  },
  "agent": {
    "version": "7.12.1",
    "hostname": "vagrant",
    "ephemeral_id": "f0221863-6941-46ec-9efc-b9255ee465fd",
    "id": "90241210-096b-4b18-8a3e-2d6ac8fbfe27",
    "name": "vagrant",
    "type": "filebeat"
  },
  "log": {
    "offset": 49175,
    "file": {
      "path": "/home/vagrant/shared/fortigate.log"
    }
  },
  "tags": [],
  "service": {
    "type": "fortinet"
  },
  "event": {
    "module": "fortinet",
    "dataset": "fortinet.firewall"
  },
  "fileset": {
    "name": "firewall"
  },
  "ecs": {
    "version": "1.8.0"
  }
}

How should I configure it so that the fortinet module parses it properly? Also note that I am not sending Filebeat's output to Logstash or Elasticsearch; I configured a file output, and I see the same unparsed JSON above in debug mode as well.

Can you post the logs from Filebeat? Did you install the ingest pipelines?

Which logs do you mean, /var/log/filebeat/filebeat? For the second question, if you mean 'filebeat setup --pipelines --modules fortinet', yes, I ran it. But remember, as I stated in the first post, I am not sending logs from Filebeat to Logstash or Elasticsearch; I simply read from a file and write to a file using the option below in filebeat.yml:

output.file:
  path: "/home/vagrant/shared/"
  filename: testout

The Elasticsearch and Logstash outputs are disabled.

Ok, I missed that. All the parsing and enrichment happens in the ingest pipeline, not locally in Filebeat.

I don't understand; isn't the parsing performed by Filebeat itself? I built similar setups before, for example using the gsuite module with Logstash, and there was no problem: there was no filter or remapping configuration on the Logstash side and everything was parsed fine. Is there some internal communication between Filebeat and Logstash where the parsing is performed on the Logstash side, and if so, is it the same for all Filebeat modules?

No, some modules do the majority of the processing locally, and some do all of it on the ingest pipeline side.
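For the fortinet module, the message is dissected by the Elasticsearch ingest pipeline named in the output above, so events have to reach Elasticsearch for parsing to happen; a file output only ever sees the raw event. A minimal filebeat.yml sketch of routing output to Elasticsearch instead (the host address is an assumption, adjust it to your cluster):

```yaml
# Send events to Elasticsearch so the
# filebeat-7.12.1-fortinet-firewall-pipeline ingest pipeline runs.
# localhost:9200 is an assumed address for illustration.
output.elasticsearch:
  hosts: ["localhost:9200"]
```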

