Palo Alto logs

Hello everybody,

I'm trying to ingest data from a PAN-OS Syslog Integration 8.1.10 source.
I read that I can use Filebeat with the panw module.
I followed https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-module-panw.html
to configure it. I set the input type and the log file path in panw.yml, and I also enabled the module.

Then I left Filebeat with its default configuration, because I'm running the instance on localhost. I just commented out the Filebeat inputs, because I only want to focus on the Palo Alto logs.
I also ran the command "filebeat setup -e" to load the index template and install the Kibana dashboards.
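For reference, this is roughly the sequence I ran (a sketch from memory; the actual panw.yml contents are shown further down in this thread):

filebeat modules enable panw          # enable the PAN-OS module
vi /etc/filebeat/modules.d/panw.yml   # set var.input: file and var.paths
filebeat setup -e                     # load the index template and the Kibana dashboards
filebeat -e                           # run Filebeat in the foreground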

The thing is, when I go to Discover or open the "PAN-OS Flows [Filebeat PANW] ECS" dashboard, I don't see any data. Filebeat runs without problems, and even when I run it in debug mode I can't see where the data is going.

Any idea what could be happening?
This is the Filebeat log; honestly, I can't see where the problem could be:

[root@localhost filebeat]# cat /var/log/filebeat/filebeat.4
2020-03-06T20:20:19.708+0100    INFO    instance/beat.go:622    Home path: [/usr/share/filebeat] Config path: [/etc/filebeat] Data path: [/var/lib/filebeat] Logs path: [/var/log/filebeat]
2020-03-06T20:20:19.724+0100    INFO    instance/beat.go:630    Beat ID: 7378502a-3d94-4710-99da-ccfc35ae4bcd
[root@localhost filebeat]#

Thank you very much in advance.

Best regards

Hi @david-vazquez,

Could you please run Filebeat with the flags -e -d processors? You should then see full output for the processors as well.
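For example, something like this (assuming the filebeat binary is on your PATH):

filebeat -e -d "processors"   # -e logs to stderr, -d enables debug output for the given selector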

Hello @mtojek,

Thanks for your reply.

I ran the command filebeat -e -d run and this is the output:

 [root@localhost modules.d]# filebeat -e -d run
2020-03-09T20:43:12.697+0100    INFO    instance/beat.go:622    Home path: [/usr/share/filebeat] Config path: [/etc/filebeat] Data path: [/var/lib/filebeat] Logs path: [/var/log/filebeat]
2020-03-09T20:43:12.698+0100    INFO    instance/beat.go:630    Beat ID: 7378502a-3d94-4710-99da-ccfc35ae4bcd
2020-03-09T20:43:15.703+0100    INFO    add_cloud_metadata/add_cloud_metadata.go:89     add_cloud_metadata: hosting provider type not detected.
2020-03-09T20:43:15.703+0100    INFO    [seccomp]       seccomp/seccomp.go:124 Syscall filter successfully installed
2020-03-09T20:43:15.703+0100    INFO    [beat]  instance/beat.go:958    Beat info       {"system_info": {"beat": {"path": {"config": "/etc/filebeat", "data": "/var/lib/filebeat", "home": "/usr/share/filebeat", "logs": "/var/log/filebeat"}, "type": "filebeat", "uuid": "7378502a-3d94-4710-99da-ccfc35ae4bcd"}}}
2020-03-09T20:43:15.703+0100    INFO    [beat]  instance/beat.go:967    Build info      {"system_info": {"build": {"commit": "c1c49432bdc53563e63e9d684ca3e9843626e448", "libbeat": "7.6.1", "time": "2020-02-28T23:12:26.000Z", "version": "7.6.1"}}}
2020-03-09T20:43:15.703+0100    INFO    [beat]  instance/beat.go:970    Go runtime info {"system_info": {"go": {"os":"linux","arch":"amd64","max_procs":2,"version":"go1.13.8"}}}
2020-03-09T20:43:15.704+0100    INFO    [beat]  instance/beat.go:974    Host info       {"system_info": {"host": {"architecture":"x86_64","boot_time":"2020-03-09T20:19:21+01:00","containerized":false,"name":"localhost.localdomain","ip":["127.0.0.1/8","::1/128","192.168.0.29/24","fe80::1bd3:953f:8943:cdaf/64","192.168.122.1/24"],"kernel_version":"3.10.0-1062.12.1.el7.x86_64","mac":["08:00:27:fd:5b:47","52:54:00:19:dd:b5","52:54:00:19:dd:b5"],"os":{"family":"redhat","platform":"centos","name":"CentOS Linux","version":"7 (Core)","major":7,"minor":7,"patch":1908,"codename":"Core"},"timezone":"CET","timezone_offset_sec":3600,"id":"08d134ea87a146a0b7436492458816c1"}}}
2020-03-09T20:43:15.706+0100    INFO    [beat]  instance/beat.go:1003   Process info    {"system_info": {"process": {"capabilities": {"inheritable":null,"permitted":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend"],"effective":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend"],"bounding":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend"],"ambient":null}, "cwd": "/etc/filebeat/modules.d", "exe": "/usr/share/filebeat/bin/filebeat", "name": "filebeat", "pid": 4610, "ppid": 3630, "seccomp": {"mode":"filter","no_new_privs":true}, "start_time": "2020-03-09T20:43:11.600+0100"}}}
2020-03-09T20:43:15.706+0100    INFO    instance/beat.go:298    Setup Beat: filebeat; Version: 7.6.1
2020-03-09T20:43:15.706+0100    INFO    [index-management]      idxmgmt/std.go:182      Set output.elasticsearch.index to 'filebeat-7.6.1' as ILM is enabled.
2020-03-09T20:43:15.706+0100    INFO    elasticsearch/client.go:174     Elasticsearch url: http://localhost:9200
2020-03-09T20:43:15.706+0100    INFO    [publisher]     pipeline/module.go:110 Beat name: localhost.localdomain
2020-03-09T20:43:15.708+0100    INFO    [monitoring]    log/log.go:118  Starting metrics logging every 30s
2020-03-09T20:43:15.708+0100    INFO    instance/beat.go:439    filebeat start running.
2020-03-09T20:43:15.991+0100    INFO    registrar/registrar.go:145      Loading registrar data from /var/lib/filebeat/registry/filebeat/data.json
2020-03-09T20:43:15.991+0100    INFO    registrar/registrar.go:152      States Loaded from registrar: 3
2020-03-09T20:43:15.991+0100    INFO    crawler/crawler.go:72   Loading Inputs: 1
2020-03-09T20:43:15.992+0100    INFO    log/input.go:152        Configured paths: [/home/user/Descargas/paloalto.20200303 /home/user/Descargas/attack_download.log /home/user/Descargas/umbrella.20200303]
2020-03-09T20:43:15.992+0100    INFO    input/input.go:114      Starting input of type: log; ID: 6160703129214275189
2020-03-09T20:43:15.993+0100    INFO    log/harvester.go:297    Harvester started for file: /home/user/Descargas/umbrella.20200303
2020-03-09T20:43:15.995+0100    INFO    log/input.go:152        Configured paths: [/home/user/Descargas/paloalto.20200303]
2020-03-09T20:43:15.995+0100    INFO    crawler/crawler.go:106  Loading and starting Inputs completed. Enabled inputs: 1
2020-03-09T20:43:15.995+0100    INFO    cfgfile/reload.go:171   Config reloader started
2020-03-09T20:43:16.005+0100    INFO    log/input.go:152        Configured paths: [/home/user/Descargas/paloalto.20200303]
2020-03-09T20:43:16.007+0100    INFO    elasticsearch/client.go:174     Elasticsearch url: http://localhost:9200
2020-03-09T20:43:16.051+0100    INFO    elasticsearch/client.go:757     Attempting to connect to Elasticsearch version 7.6.0
2020-03-09T20:43:16.079+0100    INFO    [license]       licenser/es_callback.go:50      Elasticsearch license: Basic
2020-03-09T20:43:16.188+0100    INFO    input/input.go:114      Starting input of type: log; ID: 14528820469630532633
2020-03-09T20:43:16.188+0100    INFO    cfgfile/reload.go:226   Loading of config files completed.
2020-03-09T20:43:45.712+0100    INFO    [monitoring]    log/log.go:145  Non-zero metrics in the last 30s        {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":530,"time":{"ms":532}},"total":{"ticks":1220,"time":{"ms":1231},"value":1220},"user":{"ticks":690,"time":{"ms":699}}},"handles":{"limit":{"hard":4096,"soft":1024},"open":11},"info":{"ephemeral_id":"d3bd17e1-4687-47b2-adb9-3e37227a1f35","uptime":{"ms":33097}},"memstats":{"gc_next":1469469824,"memory_alloc":774951816,"memory_total":2411829032,"rss":1876811776},"runtime":{"goroutines":40}},"filebeat":{"events":{"added":6,"done":6},"harvester":{"files":{"df22dbab-e433-40b4-befd-3670a81bf52f":{"last_event_published_time":"","last_event_timestamp":"","name":"/home/user/Descargas/umbrella.20200303","size":364292477,"start_time":"2020-03-09T20:43:15.993Z"}},"open_files":1,"running":1,"started":1}},"libbeat":{"config":{"module":{"running":0},"reloads":1},"output":{"type":"elasticsearch"},"pipeline":{"clients":3,"events":{"active":0,"filtered":6,"total":6}}},"registrar":{"states":{"current":3,"update":6},"writes":{"success":6,"total":6}},"system":{"cpu":{"cores":2},"load":{"1":3.16,"15":2.74,"5":3.07,"norm":{"1":1.58,"15":1.37,"5":1.535}}}}}}

To be honest, I don't know much about Elasticsearch. I set up the ELK Stack and I would like to ingest Palo Alto logs (and other logs) into Elastic. This is my filebeat.yml file:

 [root@localhost filebeat]# cat filebeat.yml
#=========================== Filebeat inputs =============================

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    #- c:\programdata\elasticsearch\logs\*
    - /home/user/Descargas/paloalto.20200303
    - /home/user/Descargas/attack_download.log
    - /home/user/Descargas/umbrella.20200303

  # Exclude lines. A list of regular expressions to match. It drops the lines that are
  # matching any regular expression from the list.
  #exclude_lines: ['^DBG']

  # Include lines. A list of regular expressions to match. It exports the lines that are
  # matching any regular expression from the list.
  #include_lines: ['^ERR', '^WARN']

  # Exclude files. A list of regular expressions to match. Filebeat drops the files that
  # are matching any regular expression from the list. By default, no files are dropped.
  #exclude_files: ['.gz$']

  # Optional additional fields. These fields can be freely picked
  # to add additional information to the crawled log files for filtering
  #fields:
  #  level: debug
  #  review: 1

  ### Multiline options

  # Multiline can be used for log messages spanning multiple lines. This is common
  # for Java Stack Traces or C-Line Continuation

  # The regexp Pattern that has to be matched. The example pattern matches all lines starting with [
  #multiline.pattern: ^\[

  # Defines if the pattern set under pattern should be negated or not. Default is false.
  #multiline.negate: false

  # Match can be set to "after" or "before". It is used to define if lines should be append to a pattern
  # that was (not) matched before or after or as long as a pattern is not matched based on negate.
  # Note: After is the equivalent to previous and before is the equivalent to to next in Logstash
  #multiline.match: after


#============================= Filebeat modules ===============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

#==================== Elasticsearch template setting ==========================

setup.template.settings:
  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false

#================================ General =====================================

# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
#name:

# The tags of the shipper are included in their own field with each
# transaction published.
#tags: ["service-X", "web-tier"]

# Optional fields that you can specify to add additional information to the
# output.
#fields:
#  env: staging


#============================== Dashboards =====================================
# These settings control loading the sample dashboards to the Kibana index. Loading
# the dashboards is disabled by default and can be enabled either by setting the
# options here or by using the `setup` command.
#setup.dashboards.enabled: false

# The URL from where to download the dashboards archive. By default this URL
# has a value which is computed based on the Beat name and version. For released
# versions, this URL points to the dashboard archive on the artifacts.elastic.co
# website.
#setup.dashboards.url:

#============================== Kibana =====================================

# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:

  # Kibana Host
  # Scheme and port can be left out and will be set to the default (http and 5601)
  # In case you specify and additional path, the scheme is required: http://localhost:5601/path
  # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
  host: "localhost:5601"

  # Kibana Space ID
  # ID of the Kibana Space into which the dashboards should be loaded. By default,
  # the Default Space will be used.
  #space.id:

#============================= Elastic Cloud ==================================

# These settings simplify using Filebeat with the Elastic Cloud (https://cloud.elastic.co/).

# The cloud.id setting overwrites the `output.elasticsearch.hosts` and
# `setup.kibana.host` options.
# You can find the `cloud.id` in the Elastic Cloud web UI.
#cloud.id:

# The cloud.auth setting overwrites the `output.elasticsearch.username` and
# `output.elasticsearch.password` settings. The format is `<user>:<pass>`.
#cloud.auth:

#================================ Outputs =====================================

# Configure what output to use when sending the data collected by the beat.

#-------------------------- Elasticsearch output ------------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]

  # Protocol - either `http` (default) or `https`.
  #protocol: "https"

  # Authentication credentials - either API key or username/password.
  #api_key: "id:api_key"
  #username: "elastic"
  #password: "changeme"

#----------------------------- Logstash output --------------------------------
#output.logstash:
  # The Logstash hosts
  #hosts: ["localhost:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"

#================================ Processors =====================================

# Configure processors to enhance or manipulate events generated by the beat.

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~

#================================ Logging =====================================

# Sets log level. The default log level is info.
# Available log levels are: error, warning, info, debug
#logging.level: debug

# At debug level, you can selectively enable logging only for some components.
# To enable all selectors use ["*"]. Examples of other selectors are "beat",
# "publish", "service".
#logging.selectors: ["*"]

#============================== X-Pack Monitoring ===============================
# filebeat can export internal metrics to a central Elasticsearch monitoring
# cluster.  This requires xpack monitoring to be enabled in Elasticsearch.  The
# reporting is disabled by default.

# Set to true to enable the monitoring reporter.
#monitoring.enabled: false

# Sets the UUID of the Elasticsearch cluster under which monitoring data for this
# Filebeat instance will appear in the Stack Monitoring UI. If output.elasticsearch
# is enabled, the UUID is derived from the Elasticsearch cluster referenced by output.elasticsearch.
#monitoring.cluster_uuid:
#monitoring.elasticsearch:

#================================= Migration ==================================

# This allows to enable 6.7 migration aliases
#migration.6_to_7.enabled: true
[root@localhost filebeat]#

I also enabled the panw module and configured it as follows:

[root@localhost modules.d]# cat panw.yml
# Module: panw
# Docs: https://www.elastic.co/guide/en/beats/filebeat/7.6/filebeat-module-panw.html

- module: panw
  panos:
    enabled: true

    # Set which input to use between syslog (default) or file.
    var.input: file

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: ["/home/user/Descargas/paloalto.20200303"]

What should I do to ingest the Palo Alto logs? Maybe I'm doing something wrong.
Thank you very much for your help.

You can try it this way:
./filebeat -e -d input

... and touch a couple of files in the directories you are monitoring.

By the way, with the -d switch you can enable particular log selectors; "*" enables all of them.
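For example (just a sketch using the paths from your config; the point is to make the files change so the harvesters publish new lines, ideally by appending real PAN-OS lines from your export):

echo "test line $(date)" >> /home/user/Descargas/paloalto.20200303   # append a line so Filebeat sees new content
filebeat -e -d "*"                                                   # enable every debug selector for the full picture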

@mtojek, thank you very much for your answer.

If I run Filebeat with the command you mentioned, will it ship the Palo Alto logs to Elasticsearch?
Are my config files properly configured? I don't know if I need to configure anything else.

Thank you very much in advance.

Regards

@mtojek, after running filebeat -e -d input I see the following output:

[root@localhost filebeat]# filebeat -e -d input
2020-03-10T21:07:57.773+0100    INFO    instance/beat.go:622    Home path: [/usr/share/filebeat] Config path: [/etc/filebeat] Data path: [/var/lib/filebeat] Logs path: [/var/log/filebeat]
2020-03-10T21:07:57.773+0100    INFO    instance/beat.go:630    Beat ID: 7378502a-3d94-4710-99da-ccfc35ae4bcd
2020-03-10T21:08:00.777+0100    INFO    add_cloud_metadata/add_cloud_metadata.go:89     add_cloud_metadata: hosting provider type not detected.
2020-03-10T21:08:00.777+0100    INFO    [seccomp]       seccomp/seccomp.go:124 Syscall filter successfully installed
2020-03-10T21:08:00.777+0100    INFO    [beat]  instance/beat.go:958    Beat info       {"system_info": {"beat": {"path": {"config": "/etc/filebeat", "data": "/var/lib/filebeat", "home": "/usr/share/filebeat", "logs": "/var/log/filebeat"}, "type": "filebeat", "uuid": "7378502a-3d94-4710-99da-ccfc35ae4bcd"}}}
2020-03-10T21:08:00.777+0100    INFO    [beat]  instance/beat.go:967    Build info      {"system_info": {"build": {"commit": "c1c49432bdc53563e63e9d684ca3e9843626e448", "libbeat": "7.6.1", "time": "2020-02-28T23:12:26.000Z", "version": "7.6.1"}}}
2020-03-10T21:08:00.777+0100    INFO    [beat]  instance/beat.go:970    Go runtime info {"system_info": {"go": {"os":"linux","arch":"amd64","max_procs":2,"version":"go1.13.8"}}}
2020-03-10T21:08:00.778+0100    INFO    [beat]  instance/beat.go:974    Host info       {"system_info": {"host": {"architecture":"x86_64","boot_time":"2020-03-10T20:29:33+01:00","containerized":false,"name":"localhost.localdomain","ip":["127.0.0.1/8","::1/128","192.168.0.29/24","fe80::1bd3:953f:8943:cdaf/64","192.168.122.1/24"],"kernel_version":"3.10.0-1062.12.1.el7.x86_64","mac":["08:00:27:fd:5b:47","52:54:00:19:dd:b5","52:54:00:19:dd:b5"],"os":{"family":"redhat","platform":"centos","name":"CentOS Linux","version":"7 (Core)","major":7,"minor":7,"patch":1908,"codename":"Core"},"timezone":"CET","timezone_offset_sec":3600,"id":"08d134ea87a146a0b7436492458816c1"}}}
2020-03-10T21:08:00.778+0100    INFO    [beat]  instance/beat.go:1003   Process info    {"system_info": {"process": {"capabilities": {"inheritable":null,"permitted":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend"],"effective":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend"],"bounding":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend"],"ambient":null}, "cwd": "/etc/filebeat", "exe": "/usr/share/filebeat/bin/filebeat", "name": "filebeat", "pid": 5418, "ppid": 3325, "seccomp": {"mode":"filter","no_new_privs":true}, "start_time": "2020-03-10T21:07:56.920+0100"}}}
2020-03-10T21:08:00.779+0100    INFO    instance/beat.go:298    Setup Beat: filebeat; Version: 7.6.1
2020-03-10T21:08:00.779+0100    INFO    [index-management]      idxmgmt/std.go:182      Set output.elasticsearch.index to 'filebeat-7.6.1' as ILM is enabled.
2020-03-10T21:08:00.779+0100    INFO    elasticsearch/client.go:174     Elasticsearch url: http://localhost:9200
2020-03-10T21:08:00.779+0100    INFO    [publisher]     pipeline/module.go:110 Beat name: localhost.localdomain
2020-03-10T21:08:00.780+0100    INFO    [monitoring]    log/log.go:118  Starting metrics logging every 30s
2020-03-10T21:08:00.780+0100    INFO    instance/beat.go:439    filebeat start running.
2020-03-10T21:08:00.823+0100    INFO    registrar/registrar.go:145      Loading registrar data from /var/lib/filebeat/registry/filebeat/data.json
2020-03-10T21:08:00.823+0100    INFO    registrar/registrar.go:152      States Loaded from registrar: 3
2020-03-10T21:08:00.823+0100    INFO    crawler/crawler.go:72   Loading Inputs: 1
2020-03-10T21:08:00.824+0100    DEBUG   [input] log/config.go:204       recursive glob enabled
2020-03-10T21:08:00.824+0100    DEBUG   [input] log/input.go:164        exclude_files: []. Number of stats: 3
2020-03-10T21:08:00.824+0100    DEBUG   [input] file/states.go:68       New state added for /home/user/Descargas/paloalto.20200303
2020-03-10T21:08:00.824+0100    DEBUG   [input] file/states.go:68       New state added for /home/user/Descargas/umbrella.20200303
2020-03-10T21:08:00.824+0100    DEBUG   [input] log/input.go:185        input with previous states loaded: 2
2020-03-10T21:08:00.824+0100    INFO    log/input.go:152        Configured paths: [/home/user/Descargas/paloalto.20200303 /home/user/Descargas/umbrella.20200303]
2020-03-10T21:08:00.824+0100    INFO    input/input.go:114      Starting input of type: log; ID: 13993002853132907255
2020-03-10T21:08:00.828+0100    DEBUG   [input] log/input.go:191        Start next scan
2020-03-10T21:08:00.829+0100    DEBUG   [input] log/input.go:421        Check file for harvesting: /home/user/Descargas/paloalto.20200303
2020-03-10T21:08:00.829+0100    DEBUG   [input] log/input.go:511        Update existing file for harvesting: /home/user/Descargas/paloalto.20200303, offset: 2112756165
2020-03-10T21:08:00.829+0100    DEBUG   [input] log/input.go:565        File didn't change: /home/user/Descargas/paloalto.20200303
2020-03-10T21:08:00.829+0100    DEBUG   [input] log/input.go:421        Check file for harvesting: /home/user/Descargas/umbrella.20200303
2020-03-10T21:08:00.829+0100    DEBUG   [input] log/input.go:511        Update existing file for harvesting: /home/user/Descargas/umbrella.20200303, offset: 0
2020-03-10T21:08:00.829+0100    DEBUG   [input] log/input.go:520        Resuming harvesting of file: /home/user/Descargas/umbrella.20200303, offset: 0, new size: 364292477
2020-03-10T21:08:00.829+0100    DEBUG   [input] log/input.go:212        input states cleaned up. Before: 2, After: 2, Pending: 0
2020-03-10T21:08:00.829+0100    INFO    log/harvester.go:297    Harvester started for file: /home/user/Descargas/umbrella.20200303
2020-03-10T21:08:00.830+0100    DEBUG   [input] log/config.go:204       recursive glob enabled
2020-03-10T21:08:00.830+0100    DEBUG   [input] log/input.go:164        exclude_files: [(?-s:.)gz(?-m:$)]. Number of stats: 3
2020-03-10T21:08:00.830+0100    DEBUG   [input] file/states.go:68       New state added for /home/user/Descargas/paloalto.20200303
2020-03-10T21:08:00.833+0100    DEBUG   [input] log/input.go:185        input with previous states loaded: 1
2020-03-10T21:08:00.836+0100    INFO    log/input.go:152        Configured paths: [/home/user/Descargas/paloalto.20200303]
2020-03-10T21:08:00.836+0100    INFO    crawler/crawler.go:106  Loading and starting Inputs completed. Enabled inputs: 1
2020-03-10T21:08:00.836+0100    INFO    cfgfile/reload.go:171   Config reloader started
2020-03-10T21:08:00.840+0100    DEBUG   [input] log/config.go:204       recursive glob enabled
2020-03-10T21:08:00.840+0100    DEBUG   [input] log/input.go:164        exclude_files: [(?-s:.)gz(?-m:$)]. Number of stats: 3
2020-03-10T21:08:00.840+0100    DEBUG   [input] file/states.go:68       New state added for /home/user/Descargas/paloalto.20200303
2020-03-10T21:08:00.840+0100    DEBUG   [input] log/input.go:185        input with previous states loaded: 1
2020-03-10T21:08:00.840+0100    INFO    log/input.go:152        Configured paths: [/home/user/Descargas/paloalto.20200303]
2020-03-10T21:08:00.840+0100    INFO    elasticsearch/client.go:174     Elasticsearch url: http://localhost:9200
2020-03-10T21:08:00.843+0100    INFO    elasticsearch/client.go:757     Attempting to connect to Elasticsearch version 7.6.0
2020-03-10T21:08:00.874+0100    INFO    [license]       licenser/es_callback.go:50      Elasticsearch license: Basic
2020-03-10T21:08:00.881+0100    INFO    input/input.go:114      Starting input of type: log; ID: 14528820469630532633
2020-03-10T21:08:00.881+0100    INFO    cfgfile/reload.go:226   Loading of config files completed.
2020-03-10T21:08:00.881+0100    DEBUG   [input] log/input.go:191        Start next scan
2020-03-10T21:08:00.882+0100    DEBUG   [input] log/input.go:421        Check file for harvesting: /home/user/Descargas/paloalto.20200303
2020-03-10T21:08:00.882+0100    DEBUG   [input] log/input.go:511        Update existing file for harvesting: /home/user/Descargas/paloalto.20200303, offset: 2112756165
2020-03-10T21:08:00.882+0100    DEBUG   [input] log/input.go:565        File didn't change: /home/user/Descargas/paloalto.20200303
2020-03-10T21:08:00.882+0100    DEBUG   [input] log/input.go:212        input states cleaned up. Before: 1, After: 1, Pending: 0

No data is being indexed in Elasticsearch... I don't know what's happening here. Could you help me, please?
Thank you very much

As I said, try touching some of these files or creating new ones in those directories. You can also capture some network traffic with tcpdump to see whether your logs are being sent to ES.
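A rough example (assuming Elasticsearch listens on localhost:9200, as in your filebeat.yml):

tcpdump -i lo -A 'tcp port 9200'   # watch the bulk requests Filebeat sends to Elasticsearch over the loopback interface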

I don't know what you mean by "touching". Do you mean modifying the Palo Alto log file?
Is there any problem with my Filebeat configuration?

Oh, you mean creating files in the directory, either blank or with some content. There are actually other files in the directory too, but the idea is to ingest the Palo Alto logs first and try the others later. I will export a new Palo Alto log and try running Filebeat again...
It should be ingested into Elasticsearch automatically, right? Is my Filebeat configuration fine?

Thank you very much!

@mtojek, I did what you said and modified the files in the directory. The data is now being indexed, but it is not parsed. This is what I see in Kibana:

@timestamp:  Mar 11, 2020 @ 19:33:30.490 host.id: 08d134ea87a146a0b7436492458816c1 host.containerized: false host.hostname: localhost.localdomain  
host.architecture: x86_64 host.os.name: CentOS Linux host.os.kernel: 3.10.0-1062.12.1.el7.x86_64
host.os.codename: Core host.os.platform: centos host.os.version: 7 (Core) host.os.family: redhat host.name: localhost.localdomain agent.id: 7378502a-3d94-4710-99da-ccfc35ae4bcd agent.version: 7.6.1 agent.type: filebeat agent.ephemeral_id: 47eb6b02-45f8-429e-b476-c6c01db5bfec agent.hostname: localhost.localdomain ecs.version: 1.4.0 log.offset: 490,382,016 log.file.path: /home/user/Descargas/paloalto2 message:<14>Mar 3 11:41:15 google LEEF:1.0|Palo Alto Networks|PAN-OS Syslog Integration|8.1.10|deny|cat=TRAFFIC| ReceiveTime=2020/03/03.... 

What am I doing wrong? It should be parsed automatically, right?
There are even some events in Kibana whose only fields are the timestamp and the message... so the parsing is not going well.

Thanks

What do you see in Elasticsearch? You can use the API to access the documents stored in ES directly.
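For example, something like this (assuming the default filebeat-* indices):

curl -s 'http://localhost:9200/filebeat-*/_search?size=1&pretty'   # fetch one indexed document and inspect its fields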

@mtojek, I see the fields I showed you in my last reply. The fields inside "message" are not being parsed, and that's the main problem. Data is now being indexed in Elasticsearch, but Filebeat is not splitting "message" into separate fields and values.
Can you help me?

I assume you have configured the ingest pipeline for panw/panos? Please compare your logs with the files in the module's "test" directory (e.g. traffic.log, pan_inc_traffic.log). Make sure you're using the same field separator, or reconfigure it to match yours.
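For example (a sketch; the sample logs such as traffic.log live under the panw module's test directory in the Beats source repository, so you may need to grab them from GitHub):

head -n 1 /home/user/Descargas/paloalto.20200303   # first line of your export
head -n 1 traffic.log                              # first line of the module's sample log, for comparison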

I followed the documentation: "The ingest pipelines used to parse log lines are set up automatically the first time you run the module, assuming the Elasticsearch output is enabled."
So I guess the pipelines are set up automatically, since my output is Elasticsearch and not Logstash.
I showed my filebeat.yml config in earlier messages. I enabled the panw module, and it is configured as shown above.
In any case, I also ran the command "filebeat setup --pipelines --modules panw" when I configured it.

I only want to parse one PAN-OS log file; the other files are in different formats and need Logstash to parse them. So the focus is on a single file.
The thing is, I am able to ingest the PAN-OS log into Elasticsearch, but the data is not parsed into field:value pairs. I only see the "message" field containing the full log line, not the separate fields it should be split into. So the problem is that the Filebeat panw module is not parsing the PAN-OS log properly. The PAN-OS log is the only file I want to ingest into Elasticsearch with Filebeat.

Maybe I misconfigured something, but that's what I'm trying to say.

Thanks.

{
  "_index": "filebeat-7.6.1-2020.03.10-000001",
  "_type": "_doc",
  "_id": "HXJe0HABrJOeGnf1wD6o",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2020-03-12T20:10:32.472Z",
    "message": "<14>Mar  3 01:11:41 google LEEF:1.0|Palo Alto Networks|PAN-OS Syslog Integration|8.1.10|allow|cat=TRAFFIC| ReceiveTime=2020/03/03 01:11:40|SerialNumber=001801044140|Type=TRAFFIC|Subtype=end|devTime=$cef-formatted-receive_ time|src=0.0.0.0|dst=ip|srcPostNAT=ip|dstPostNAT=ip|RuleName=RP-OUT-03|usrName=| SourceUser=|DestinationUser=|Application=soap|VirtualSystem=vsys1|SourceZone=Produccion| DestinationZone=Outside|IngressInterface=ethernet1/1|EgressInterface=ethernet2| LogForwardingProfile=SIEM|SessionID=1|RepeatCount=1|srcPort=40679|dstPort=3129| srcPostNATPort=0|dstPostNATPort=0|Flags=0x144000|proto=tcp|action=allow| totalBytes=27235|dstBytes=7380|srcBytes=19855|totalPackets=43|StartTime=2020/03/03 01:09:28| ElapsedTime=11|URLCategory=license-expired|sequence=7768340108|ActionFlags=0x0| SourceLocation=ip|DestinationLocation=Poland|dstPackets=18|srcPackets=25| SessionEndReason=aged-out|DeviceGroupHierarchyL1=0| DeviceGroupHierarchyL2=0|DeviceGroupHierarchyL3=0| DeviceGroupHierarchyL4=0|vSrcName=|DeviceName=user| ActionSource=from-policy|SrcUUID=|DstUUID=|TunnelID=0| MonitorTag=|ParentSessionID=0|ParentStartTime=| TunnelType=N/A",
    "input": {
      "type": "log"
    },
    "event": {
      "module": "panw",
      "dataset": "panw.panos",
      "timezone": "+01:00"
    },
    "host": {
      "name": "localhost.localdomain",
      "hostname": "localhost.localdomain",
      "architecture": "x86_64",
      "os": {
        "platform": "centos",
        "version": "7 (Core)",
        "family": "redhat",
        "name": "CentOS Linux",
        "kernel": "3.10.0-1062.12.1.el7.x86_64",
        "codename": "Core"
      },
      "id": "08d134ea87a146a0b7436492458816c1",
      "containerized": false
    },
    "log": {
      "file": {
        "path": "/home/user/Descargas/paloalto2"
      },
      "offset": 2075354276
    },
    "tags": [
      "pan-os"
    ],
    "fileset": {
      "name": "panos"
    },
    "service": {
      "type": "panw"
    },
    "agent": {
      "version": "7.6.1",
      "type": "filebeat",
      "ephemeral_id": "c28e5b8c-dbe7-46b6-8719-da8cec5feba0",
      "hostname": "localhost.localdomain",
      "id": "7378502a-3d94-4710-99da-ccfc35ae4bcd"
    },
    "ecs": {
      "version": "1.4.0"
    }
  },
  "fields": {
    "suricata.eve.timestamp": [
      "2020-03-12T20:10:32.472Z"
    ],
    "@timestamp": [
      "2020-03-12T20:10:32.472Z"
    ]
  },
  "sort": [
    1584043832472
  ]
}

Please hit this URL: http://localhost:9200/_ingest/pipeline . Can you find a pipeline with the description "Pipeline for Palo Alto Networks PAN-OS Logs"?
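For example:

curl -s 'http://localhost:9200/_ingest/pipeline?pretty' | grep 'Pipeline for Palo Alto Networks PAN-OS Logs'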

If not, you haven't set up the pipeline yet, so run:

filebeat modules enable panw
filebeat setup -e --pipelines -d '*'

Yes, I found it:

{"filebeat-7.6.1-panw-panos-pipeline":{"description":"Pipeline for Palo Alto Networks PAN-OS Logs","processors":[{"rename":{"target_field":"log.original","field":"message"}},{"date":{"field":"_temp_.generated_time","formats":["yyyy/MM/dd HH:mm:ss"],"on_failure":[{"append":{"field":"error.message","value":"{{ _ingest.on_failure_message }}"}}],"if":"ctx.event.timezone == null"}},{"date":{"if":"ctx.event.timezone != null","field":"_temp_.generated_time","formats":["yyyy/MM/dd HH:mm:ss"],"timezone":"{{ event.timezone }}","on_failure":[{"append":{"value":"{{ _ingest.on_failure_message }}","field":"error.message"}}]}},{"date":{"formats":["yyyy/MM/dd HH:mm:ss"],"on_failure":[{"append":{"field":"error.message","value":"{{ _ingest.on_failure_message }}"}}],"if":"ctx.event.timezone == null && ctx.event.created != null ","field":"event.created","target_field":"event.created"}},{"date":{"field":"event.created","target_field":"event.created","formats":["yyyy/MM/dd HH:mm:ss"],"timezone":"{{ event.timezone }}","on_failure":[{"append":{"field":"error.message","value":"{{ _ingest.on_failure_message }}"}}],"if":"ctx.event.timezone != null && ctx.event.created != null "}},{"date":{"formats":["yyyy/MM/dd HH:mm:ss"],"on_failure":[{"append":{"value":"{{ _ingest.on_failure_message }}","field":"error.message"}}],"if":"ctx.event.timezone == null && ctx.event.start != null","field":"event.start","target_field":"event.start"}},{"date":{"on_failure":[{"append":{"field":"error.message","value":"{{ _ingest.on_failure_message }}"}}],"if":"ctx.event.timezone != null && ctx.event.start != null","field":"event.start","target_field":"event.start","timezone":"{{ event.timezone }}","formats":["yyyy/MM/dd HH:mm:ss"]}},{"convert":{"type":"long","ignore_missing":true,"field":"client.bytes"}},{"convert":{"ignore_missing":true,"field":"client.packets","type":"long"}},{"convert":{"type":"long","ignore_missing":true,"field":"client.port"}},{"convert":{"type":"long","ignore_missing":true,"field":"server.bytes"}},{"convert":{"type":"long","ignore_missing":true,"field":"server.packets"}},{"convert":{"type":"long","ignore_missing":true,"field":"server.port"}},{"convert":{"field":"source.bytes","type":"long","ignore_missing":true}},{"convert":{"ignore_missing":true,"field":"source.packets","type":"long"}},{"convert":{"field":"source.port","type":"long","ignore_missing":true}},{"convert":{"ignore_missing":true,"field":"destination.bytes","type":"long"}},{"convert":{"field":"destination.packets","type":"long","ignore_missing":true}},{"convert":{"ignore_missing":true,"field":"destination.port","type":"long"}},{"convert":{"type":"long","ignore_missing":true,"field":"network.bytes"}},{"convert":{"ignore_missing":true,"field":"network.packets","type":"long"}},{"convert":{"type":"long","ignore_missing":true,"field":"event.duration"}},{"convert":{"field":"_temp_.labels","type":"long","ignore_missing":true}},{"convert":{"ignore_missing":true,"field":"panw.panos.sequence_number","type":"long"}},{"convert":{"field":"source.nat.port","type":"long","ignore_missing":true}},{"convert":{"field":"destination.nat.port","type":"long","ignore_missing":true}},{"remove":{"if":"ctx?.panw?.panos?.network?.pcap_id == \"0\"","field":["panw.panos.network.pcap_id"]}},{"script":{"if":"ctx?._temp_?.labels != null && ctx._temp_.labels != 
0","params":{"ssl_decrypted":16777216,"symmetric_return":2048,"pcap_included":2147483648,"ipv6_session":33554432,"container_page":32768,"nat_translated":4194304,"x_forwarded_for":524288,"http_proxy":262144,"captive_portal":2097152,"temporary_match":8192,"url_filter_denied":8388608},"source":"def labels = ctx?.labels; if (labels == null) {\n labels = new HashMap();\n ctx['labels'] = labels;\n} long value = ctx._temp_.labels; for (entry in params.entrySet()) {\n if ((value & entry.getValue()) != 0) {\n labels[entry.getKey()] = true;\n }\n}\n","lang":"painless"}},{"script":{"params":{"NANOS_IN_A_SECOND":1000000000},"source":"long nanos = ctx['event']['duration'] * params.NANOS_IN_A_SECOND; ctx['event']['duration'] = nanos; def start = ctx.event?.start; if (start != null) {\n ctx.event['end'] = ZonedDateTime.parse(start).plusNanos(nanos);\n}\n","lang":"painless","if":"ctx?.event?.duration != null"}},{"set":{"if":"ctx?._temp_?.message_type == \"TRAFFIC\" && ctx?.panw?.panos?.source?.zone == \"untrust\" && ctx?.panw?.panos?.destination?.zone == \"trust\"","field":"network.direction","value":"inbound"}},{"set":{"if":"ctx?._temp_?.message_type == \"TRAFFIC\" && ctx?.panw?.panos?.source?.zone == \"trust\" && ctx?.panw?.panos?.destination?.zone == \"untrust\"","field":"network.direction","value":"outbound"}},{"set":{"if":"ctx?._temp_?.message_type == \"TRAFFIC\" && ctx?.panw?.panos?.source?.zone == \"trust\" && ctx?.panw?.panos?.destination?.zone == \"trust\"","field":"network.direction","value":"internal"}},{"set":{"if":"ctx?._temp_?.message_type == \"TRAFFIC\" && ctx?.panw?.panos?.source?.zone == \"untrust\" && ctx?.panw?.panos?.destination?.zone ==

So the panw module is properly installed. The main problem is still that the data is not being parsed before it is indexed into Elasticsearch...
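I might try running one of the raw lines through that pipeline with the simulate API to see where the parsing fails (just an idea on my side, using the pipeline name from the output above; I don't know whether part of the parsing also happens in Filebeat before the event reaches the pipeline):

curl -s -X POST 'http://localhost:9200/_ingest/pipeline/filebeat-7.6.1-panw-panos-pipeline/_simulate?pretty' \
  -H 'Content-Type: application/json' \
  -d '{"docs":[{"_source":{"message":"<paste one raw PAN-OS line here>","event":{"timezone":"+01:00"}}}]}'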

You can try this approach to define a pipeline: https://www.elastic.co/guide/en/beats/filebeat/7.6/configuring-ingest-node.html
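A minimal sketch of that approach (my-panw-pipeline and the kv processor here are only examples and would need to be adapted to your actual field separators):

curl -s -X PUT 'http://localhost:9200/_ingest/pipeline/my-panw-pipeline' \
  -H 'Content-Type: application/json' \
  -d '{"description":"Custom PAN-OS parsing","processors":[{"kv":{"field":"message","field_split":"\\|","value_split":"=","ignore_failure":true}}]}'

... and then point the Elasticsearch output in filebeat.yml at it:

output.elasticsearch:
  hosts: ["localhost:9200"]
  pipeline: my-panw-pipeline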