Palo Alto Logs Not Parsing Properly with panw Module in Filebeat

Hi Team,

I am using Palo Alto VM version 11.0.1 and forwarding syslogs to Elasticsearch through Filebeat using the panw module. While I can see the logs in Kibana, they are not being parsed properly. All the traffic logs are appearing in the event.original field, and no other fields are being populated. Here's an example log from the event.original field:

<14>1 2024-12-01T11:20:03+06:00 PA-VM-Unit-1 - - - - netbios-dg 192.168.10.117 138 192.168.10.255 138 allow for log

Here’s my panw.yml configuration:

  - module: panw
    panos:
      enabled: true
      var.input: syslog
      var.syslog_host: 0.0.0.0
      var.syslog_port: 9001

What I Have Done:

  1. Verified that the logs are reaching Filebeat and Elasticsearch.
  2. Checked that the panw module is enabled and configured for syslog input (see the commands after this list).
  3. Observed that all logs are in event.original without being parsed.
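
For reference, the check in item 2 can be reproduced from the Filebeat host with the standard Filebeat CLI (these are stock Filebeat commands; adjust paths if filebeat is not on your PATH):

  filebeat modules list      # "panw" should appear under Enabled
  filebeat test config -e    # validates the configuration files
  filebeat test output       # verifies the connection to Elasticsearch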

Questions:

  1. Is there anything I am missing in the configuration?
  2. Do I need to adjust the log format on the Palo Alto side for the panw module to parse the logs properly?
  3. Are there additional steps required to ensure compatibility with Palo Alto VM version 11.0.1?

Any guidance would be greatly appreciated!

Thanks in advance.

What version of the stack?

Did you run setup BEFORE starting filebeat / sending logs? Otherwise it will not work

filebeat setup -e

Do you have Logstash in the ingest flow?

I don't think that the panw module will parse this kind of log.

It is built to parse the firewall's own log messages, not messages from the VM that hosts the virtualized firewall.

It expects a CSV message.

I think you will need to configure your device to send the traffic logs to one port and the VM logs to a generic syslog input.
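
As a rough sketch of what I mean (the port 9002 is just an example, and what is shown is the generic Filebeat syslog input, not part of the panw module), something like this in filebeat.yml could receive the VM/system messages separately:

  filebeat.inputs:
    # Generic syslog listener for the PAN-OS system/VM messages,
    # separate from the panw module port used for the CSV traffic logs
    - type: syslog
      protocol.udp:
        host: "0.0.0.0:9002"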


Hi,
Elasticsearch & Kibana are version 8.16.0 and Filebeat is version 8.16.1. I have run the filebeat setup -e command. I am getting parsed Fortinet & Sophos logs using their modules, but not for Palo Alto.
I did not use Logstash.

Hi,
I have Fortinet, Sophos, and Palo Alto VMs. I am forwarding each one's logs to a specific port, and I have three different Filebeat servers, one for each firewall's logs. I am getting parsed Fortinet & Sophos logs using their modules, but not for Palo Alto.

The example message you shared is not exactly a Palo Alto firewall log; it is a log from the VM that virtualizes your firewall.

The logs that the module will parse are the traffic logs, which are in a CSV format. What you shared is a plain syslog message from the VM, which will not be parsed by this module.
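
For comparison, a PAN-OS TRAFFIC log arriving over syslog is one long comma-separated line. The abbreviated example below uses made-up values and elides most fields (the exact field order depends on the PAN-OS version), but it shows the shape the module expects:

  1,2024/12/01 11:20:03,012345678901,TRAFFIC,end,...,192.168.10.117,192.168.10.255,...,allow,...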

Are you getting any traffic or threat logs? If not, then your configuration on the Palo Alto side is not correct yet.

I am sharing how I forward logs from my VM to a server.

  1. Device > Server Profiles > Syslog
    Here, I am sending UDP on port 514 in BSD/IETF format.

  2. Objects > Log Forwarding
    Here, I am forwarding traffic, url, threat, auth, data, decryption, tunnel, and wildfire logs to that syslog server.

  3. In Policies, I have added the log forwarding profile to the security policy's Log Forwarding option.

Is this process OK? Or can you suggest the correct process?

You need to configure this to send to your Filebeat server on the port you configured, which seems to be 9001.

Is this the configuration?
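
Also, as a generic sanity check (not specific to the panw module), you can confirm on the Filebeat host that the logs are actually arriving on that port, for example with tcpdump:

  sudo tcpdump -ni any udp port 9001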

I changed the listening port from 514 to 9001, but the output remains the same. It appears that when Palo Alto forwards logs to the Filebeat server, the original log is being stored as a single field named event.original.

As a result, the entire log message ends up in one field instead of being broken down into separate fields such as IP, port, hostname, application, etc. In Kibana, I see the complete log message in a single column instead of a structured format.
Any advice on resolving this and ensuring proper parsing would be appreciated.

Please share an entire document JSON from Discover.
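
In the meantime, it is also worth confirming that the panw ingest pipelines were installed in Elasticsearch. The pipeline name pattern below follows the usual Filebeat naming convention, and the URL and credentials are assumptions for a default local install, so adjust them to your environment:

  curl -k -u elastic "https://localhost:9200/_ingest/pipeline/filebeat-*panw*?pretty"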