filebeat.yml config setup for individual log processing of different Docker containers

hello, i'm having problems setting up a working Filebeat log shipper for Docker instances with different input config files for different containers.

my goal is to collect and process Docker logs individually from different containers running on one vm. i want Filebeat to send those to Elastic Cloud.

the only thing i can get to work is a setup for a single container in the main filebeat.yml:
i match a specific docker image and run specific processing on it, e.g.

filebeat.inputs:
- type: log
  enabled: false
  paths:
    - /var/log/*.log
filebeat.autodiscover:
  providers:
    - type: docker
      templates:
        - condition:
            equals.docker.container.image: imagename
          config:
            - type: container
              paths:
                - /var/lib/docker/containers/${data.docker.container.id}/*.log
# ============================== Filebeat modules ==============================
#filebeat.config.modules:
#  path: ${path.config}/modules.d/*.yml
#  reload.enabled: false
#filebeat.config.inputs:
#  enabled: true
#  path: /etc/filebeat/inputs.d/*.yml
#
# ======================= Elasticsearch template setting =======================
#setup.template.settings:
#  index.number_of_shards: 1
# ================================== General ===================================
#name:
tags: ["development", "shop", "backend"]
fields:
  env: development
  app: shopbackend
# =================================== Kibana ===================================
#setup.kibana:
# =============================== Elastic Cloud ================================
cloud.id: "${USERCLOUD}:${PWCLOUD}"
cloud.auth: "${USER}:${PW}"
# ================================== Outputs ===================================
# ---------------------------- Elasticsearch Output ----------------------------
#output.console:
#  pretty: true
# ================================= Processors =================================
processors:
  - dissect:
      tokenizer: "%{date} %{time}  %{level} %{app} %{pid} - %{class} : %{message}"
      field: "message"
      target_prefix: "log"
      trim_values: "right"
  - drop_fields:
      fields: ["log.date", "log.time"]
      ignore_missing: true
# ================================== Logging ===================================
logging.level: info

i have tried filebeat.config.inputs (with enabled: true and path: /etc/filebeat/inputs.d/*.yml) and defined .yml files for the other containers, but no setup involving more than the main file works.

it would help me if someone could show me how to define this container-specific log filtering in a config file under inputs.d/, while the main filebeat.yml just points to it and does the cloud auth.
i'd like to have a specific yml file for each container with individual processing under inputs.d/

in case that's possible at all
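for reference, my current understanding (untested, image names are placeholders) is that the per-container split could also live entirely in the main filebeat.yml as several autodiscover templates, each attaching its own input-level fields and processors:

```yaml
# untested sketch: one autodiscover block, one template per image,
# each template carrying its own fields and processors
filebeat.autodiscover:
  providers:
    - type: docker
      templates:
        - condition:
            equals.docker.container.image: imagename-a
          config:
            - type: container
              paths:
                - /var/lib/docker/containers/${data.docker.container.id}/*.log
              fields:
                app: container-a
              processors:
                - dissect:
                    tokenizer: "%{date} %{time}  %{level} %{app} %{pid} - %{class} : %{message}"
                    field: "message"
                    target_prefix: "log"
        - condition:
            equals.docker.container.image: imagename-b
          config:
            - type: container
              paths:
                - /var/lib/docker/containers/${data.docker.container.id}/*.log
              fields:
                app: container-b
              # different processors for the different log output of b
```

but that would pile everything into one big filebeat.yml, which is what i was hoping to avoid with inputs.d/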

Can you try to use contains instead of equals in your condition clause?

If that doesn't help, could you switch to output.console instead of output.elasticsearch and post an event that's printed to your console here?

yes, i think i could use contains.docker.container.image: "shop-" or similar to find all containers of, e.g., the shop deployment. but that does not help me define different processing and tags/fields per container, as they have different log outputs.

the above works for a single instance, but i'm looking for a differentiated setup like the pseudo config below, which doesn't work:

filebeat.yml
- host logs
- filebeat.config.inputs:
    enabled: true
    path: /etc/filebeat/inputs.d/*.yml
- cloud auth

inputs.d/container-a.yml
- imagename-a
- processors-a
inputs.d/container-b.yml
- imagename-b
- processors-b
inputs.d/container-c.yml
- ...
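if i read the docs right, files loaded via filebeat.config.inputs may only contain a bare list of input definitions — no filebeat.autodiscover, tags or processors at the top level — so a container-a.yml would presumably have to look something like this (untested sketch, field values are placeholders), though i don't see how to match by image name here without autodiscover:

```yaml
# inputs.d/container-a.yml -- untested sketch: only a list of inputs
# is allowed in an external input file loaded via filebeat.config.inputs
- type: container
  paths:
    - /var/lib/docker/containers/*/*.log
  fields:
    env: development
    app: container-a
  processors:
    - dissect:
        tokenizer: "%{date} %{time}  %{level} %{app} %{pid} - %{class} : %{message}"
        field: "message"
        target_prefix: "log"
```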

i tried inputs like this, starting with 'type' as stated here:

#inputs.d/container-a.yml example file 
    filebeat.inputs:
    - type: log
      enabled: true
    filebeat.autodiscover:
      providers:
        - type: docker
          templates:
            - condition:
                equals.docker.container.image: imagename-a
              config:
                - type: container
                  paths:
                    - /var/lib/docker/containers/${data.docker.container.id}/*.log
#tagging image
    tags: ["development", "shop", "backend"]
    fields:
      env: development
      app: shop.services
#processors for image-a
    processors:
      - dissect:
          tokenizer: "%{date} %{time}  %{level} %{app} %{pid} - %{class} : %{message}"
          field: "message"
          target_prefix: "log"
          trim_values: "right"
      - drop_fields:
          fields: ["log.date", "log.time"]
          ignore_missing: true

    logging.level: info

could someone provide a working example where inputs are separately configured in inputs.d?
i'm either getting no logs or a failed start / exit code 1 when i separate the configs

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.