How to add a field for logs from a specific file

Hello,

I'm a new user of ELK. I would like to read logs from different files and apply a grok filter only to certain logs/files. My setup looks like this:
firewall logs -> rsyslog -> file -> filebeat -> logstash -> Elastic/Kibana

If I understand correctly, I should add a field in the Filebeat configuration and then wrap the filter configuration in a Logstash conditional like if [type] == "firewall" { ... }.
I couldn't find the right filebeat.yml configuration for that. I was trying something like this:
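For reference, the Logstash side of that plan might look like this (a sketch; the grok pattern is only a placeholder, and depending on Filebeat's fields_under_root setting the field may arrive as [fields][type] rather than [type]):

```
filter {
  if [type] == "firewall" {
    grok {
      # Placeholder pattern -- replace with the real firewall log format
      match => { "message" => "%{GREEDYDATA:fw_message}" }
    }
  }
}
```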

- type: log
  enabled: true
  paths:
    - /var/log/fw/*.log
  fields:
    type: firewall

- type: log
  enabled: true
  paths:
   - /var/log/sw/*.log
   - /var/log/pxy/*.log
   - /var/log/srv/*.log

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
  - add_fields:
      target: ''
      fields:
        name: type
        id: '999999999'

but it doesn't work. Thanks for any help!

Pedro

I think you are mostly there! Based on what you already have, you could solve this in one of two ways:

  1. You could move your add_fields processor section under the first input's configuration. Processors can be defined globally (as you have done) but also per input. See https://www.elastic.co/guide/en/beats/filebeat/current/defining-processors.html#where-valid.

  2. You could leave your add_fields processor section where it is (in the global list of processors) but then you probably want to add a conditional configuration section under it, so only type: firewall events are processed by that processor. See https://www.elastic.co/guide/en/beats/filebeat/current/defining-processors.html#defining-processors and https://www.elastic.co/guide/en/beats/filebeat/current/defining-processors.html#conditions.
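As untested sketches based on your config above: for option 1, the processor moves under the firewall input (here it adds type: firewall, which is what your Logstash conditional expects):

```yaml
- type: log
  enabled: true
  paths:
    - /var/log/fw/*.log
  processors:
    - add_fields:
        target: ''
        fields:
          type: firewall
```

For option 2, the processor stays in the global list but gets a when condition; here the condition matches on the file path instead (the log.file.path field assumes Filebeat 7.x; on 6.x the equivalent field is source):

```yaml
processors:
  - add_fields:
      when:
        contains:
          log.file.path: '/var/log/fw/'
      target: ''
      fields:
        type: firewall
```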

Hope that helps,

Shaunak