Hello, I'm having problems setting up a working Filebeat log shipper for Docker containers that uses different input config files for different containers.
My goal is to collect and process the Docker logs individually per container on one VM and have Filebeat ship them to Elastic Cloud.
The only thing I can get to work is a setup for a single container directly in the main filebeat.yml,
where I match on a specific Docker image and run specific processing on its logs, e.g.:
filebeat.inputs:
  - type: log
    enabled: false
    paths:
      - /var/log/*.log

filebeat.autodiscover:
  providers:
    - type: docker
      templates:
        - condition:
            equals.docker.container.image: imagename
          config:
            - type: container
              paths:
                - /var/lib/docker/containers/${data.docker.container.id}/*.log
# ============================== Filebeat modules ==============================
#filebeat.config.modules:
# path: ${path.config}/modules.d/*.yml
# reload.enabled: false
#filebeat.config.inputs:
# enabled: true
# path: /etc/filebeat/inputs.d/*.yml
#
# ======================= Elasticsearch template setting =======================
#setup.template.settings:
# index.number_of_shards: 1
# ================================== General ===================================
#name:
tags: ["development", "shop", "backend"]
fields:
  env: development
  app: shopbackend
# =================================== Kibana ===================================
#setup.kibana:
# =============================== Elastic Cloud ================================
cloud.id: "${USERCLOUD}:${PWCLOUD}"
cloud.auth: "${USER}:${PW}"
# ================================== Outputs ===================================
# ---------------------------- Elasticsearch Output ----------------------------
#output.console:
# pretty: true
# ================================= Processors =================================
processors:
  - dissect:
      tokenizer: "%{date} %{time} %{level} %{app} %{pid} - %{class} : %{message}"
      field: "message"
      target_prefix: "log"
      trim_values: "right"
  - drop_fields:
      fields: ["log.date", "log.time"]
      ignore_missing: true
# ================================== Logging ===================================
logging.level: info
I have tried with filebeat.config.inputs (enabled: true, path: /etc/filebeat/inputs.d/*.yml) and defined .yml files for the other containers there, but no setup containing more than the main file works.
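For clarity, this is the stanza I added to the main filebeat.yml for that attempt (matching the commented-out block above):

filebeat.config.inputs:
  enabled: true
  path: /etc/filebeat/inputs.d/*.yml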
It would help me a lot if someone could show me how to define this container-specific log filtering in a container-specific config file under inputs.d/, while the main filebeat.yml just points to it and handles the Cloud auth.
I'd like a separate .yml file per container, each with its own processing, under inputs.d/;
a sketch of what I have in mind follows below, in case that's possible at all.
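Something like this is what I'm imagining. The file name shopbackend.yml is just a placeholder, and I honestly don't know whether per-input processors and an image/container condition can be expressed in such a file at all:

# /etc/filebeat/inputs.d/shopbackend.yml  (hypothetical)
- type: container
  paths:
    - /var/lib/docker/containers/*/*.log   # <-- how do I restrict this to one specific image/container here?
  processors:
    - dissect:
        tokenizer: "%{date} %{time} %{level} %{app} %{pid} - %{class} : %{message}"
        field: "message"
        target_prefix: "log"
        trim_values: "right"
    - drop_fields:
        fields: ["log.date", "log.time"]
        ignore_missing: true

The main filebeat.yml would then keep only the Cloud settings (cloud.id, cloud.auth), the global tags/fields, and the filebeat.config.inputs pointer.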