The best approach would be to use Elastic Agent, since it ships integrations with pre-defined parsers for both FortiGate and FortiManager. Elastic Agent would listen for the logs on a separate port for each device type.
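As a rough illustration, a standalone Elastic Agent policy could define one UDP input per device. This is only a minimal sketch under assumptions: the ports, IDs, output address, and the `fortinet_fortimanager.log` dataset name are placeholders you would need to verify against the integration documentation for your versions.

```yaml
# elastic-agent.yml (standalone) -- minimal sketch, not a complete policy
outputs:
  default:
    type: elasticsearch
    hosts: ["https://localhost:9200"]   # assumed cluster address
    api_key: "REDACTED"                 # placeholder credential

inputs:
  # FortiGate syslog over UDP (port is an example; use whatever you configure on the firewall)
  - id: udp-fortinet-fortigate
    type: udp
    use_output: default
    streams:
      - id: fortigate-log
        data_stream:
          dataset: fortinet_fortigate.log
        host: "0.0.0.0:9004"

  # FortiManager forwarding on a separate UDP port (dataset name is an assumption)
  - id: udp-fortinet-fortimanager
    type: udp
    use_output: default
    streams:
      - id: fortimanager-log
        data_stream:
          dataset: fortinet_fortimanager.log
        host: "0.0.0.0:9005"
```

If you manage agents through Fleet instead of standalone policies, the same settings are exposed in the integration UI when you add it to an agent policy.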
FortiGate logs can be sent to Elasticsearch easily, but FortiManager logs use a different format, which causes field-mapping issues. The best practice is to enable JSON output in FortiManager so the logs arrive in a clean, structured form. If JSON is not available, use the CEF format instead.
Apply a JSON filter in Logstash and keep field names consistent so you can build unified Kibana dashboards.
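If you go the Logstash route rather than Elastic Agent, a minimal pipeline sketch could look like the following. The ports, index name, and the renamed field are assumptions for illustration only; adjust them to your environment and to ECS naming as needed.

```
# fortinet.conf -- minimal Logstash pipeline sketch
input {
  udp { port => 5514  type => "fortigate" }      # assumed port for FortiGate syslog
  udp { port => 5515  type => "fortimanager" }   # assumed port for FortiManager forwarding
}

filter {
  if [type] == "fortimanager" {
    # Parse the JSON payload produced when JSON output is enabled on FortiManager
    json {
      source         => "message"
      tag_on_failure => ["_fortimanager_json_failure"]
    }
    # Keep field names consistent with the FortiGate events
    # (example rename; the source field name is an assumption)
    mutate {
      rename => { "devname" => "[observer][name]" }
    }
  }
}

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]           # assumed cluster address
    index => "logs-fortinet-%{+YYYY.MM.dd}"       # one index pattern so dashboards stay unified
  }
}
```

Keeping both device types in the same index pattern (or data stream) with aligned field names is what makes a single set of Kibana dashboards work for both sources.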
Regarding the Pass4Future FortiManager 7.6 Administrator study materials, I did not find any specific examples related to Elasticsearch or Logstash integration.
You can check the materials here: Pass4Future FortiManager 7.6 Administrator Exam Material. That section is mostly focused on FortiManager's native log management, not on third-party setups.
I didn’t say that Pass4Future questions cover Elasticsearch or Logstash integration specifically.
What I meant was that the materials mention third-party integrations in general.
That’s why I connected it to the possibility of using Elastic tools as an example of external log-management integration. You can verify this from their Fortinet FCP_FMG_AD-7.6 Exam Questions.