Recently I set up the Fortinet module for logs from FortiGate firewalls.
- module: fortinet
  firewall:
    enabled: true

    # Set which input to use between tcp, udp (default) or file.
    #var.input: udp

    # The interface to listen to syslog traffic. Defaults to
    # localhost. Set to 0.0.0.0 to bind to all available interfaces.
    #var.syslog_host: localhost

    # The port to listen for syslog traffic. Defaults to 9004.
    #var.syslog_port: 9004

    var.input: "file"
    var.paths: ["/var/log/syslog/filename.log"]
But when I check the indexed documents, I can see that every document contains an error message.
It's because the pipeline is still expecting the syslog priority string, e.g. <188>. See the link I posted above to the sample data that's used to test the module. How are you getting different data at the front? We can always update the grok pattern, but I'm curious whether this is an isolated incident or could be more widespread.
For you specifically, you can update the grok pattern in the ingest pipeline for the Fortinet firewall (it should be named something like filebeat-7.12.1-fortinet-firewall-pipeline) to something like the example below.
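A rough sketch of the kind of change meant here, assuming the pipeline's grok processor starts by matching the priority. The exact pattern shipped with your Filebeat version will differ, and the capture field name `syslog5424_sd` is an assumption, so copy the real pattern from your existing pipeline and only wrap the priority part in an optional group:

```json
{
  "grok": {
    "field": "message",
    "patterns": [
      "(?:%{SYSLOG5424PRI})?%{GREEDYDATA:syslog5424_sd}"
    ]
  }
}
```

The `(?:...)?` makes the `<188>`-style prefix optional, so lines both with and without it will parse instead of failing the pipeline.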
Are you using the default logging settings on your firewall? I see you are using the log file input, not the syslog input. Are you running Filebeat on the firewall, or using another tool to put the logs into a file?
The logs are sent via syslog to my Linux host and saved to a file. Filebeat reads from this file. I am only the receiver of the logs; I do not have visibility into the FortiGate configuration.
Ok, that makes sense. Then whatever program is saving the syslog to the file is probably adding that data to the front. If you're able, you could use the syslog input on Filebeat and have the firewall logs sent directly to Filebeat, without the need for another tool or for storing them in a file. That should remove the need to modify the ingest pipeline. If not, then you should be good with the current modifications. Just know that when you upgrade, the ingest pipeline will be upgraded too and you may need to make the changes again.
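For reference, the direct-syslog setup would look roughly like this, using the defaults already shown in the commented config above (point the firewall, or the forwarding syslog daemon, at this host and port):

```yaml
- module: fortinet
  firewall:
    enabled: true
    var.input: udp
    # Bind to all interfaces so the firewall can reach Filebeat.
    var.syslog_host: 0.0.0.0
    var.syslog_port: 9004
```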