Filebeat not collecting Docker logs

I am running an ELK stack (Kibana, Logstash, Elasticsearch) locally as three separate containers, on Docker Desktop for Windows (though I plan to migrate this entire setup to AWS).

I am also running a Java microservice in a separate container, and I've added a Filebeat container to the same docker-compose.yml to collect that microservice's logs and forward them to ELK.

The ELK stack runs fine, but Filebeat is not collecting any logs. I've enabled DEBUG logging in Filebeat, and it shows zero log files being harvested. What am I doing wrong?

Here is my filebeat.yml:

filebeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false

filebeat.autodiscover:
  providers:
    - type: docker
      hints.enabled: true

output.elasticsearch:
  hosts: ["127.0.0.1:9200"]
setup.kibana:
  host: "127.0.0.1:5601"

logging.level: info
logging.to_files: true
logging.files:
  path: /var/log/filebeat
  name: filebeat.log
  keepfiles: 7
ssl.verification_mode: none
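
As I understand the docs, with hints.enabled: true Filebeat should pick up every container's logs by default, and an individual container can opt out with a co.elastic.logs/enabled label -- for example (the service name and image here are made up):

services:
  noisy-service:                        # hypothetical service
    image: example/noisy-service:1.0    # illustrative image
    labels:
      co.elastic.logs/enabled: "false"  # hint telling Filebeat to skip this container

None of my containers set that label, so nothing should be excluded.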

Filebeat Dockerfile:

FROM docker.elastic.co/beats/filebeat:7.5.1
COPY filebeat.yml /usr/share/filebeat/filebeat.yml
USER root
RUN chown root:filebeat /usr/share/filebeat/filebeat.yml
USER filebeat

docker-compose.yml:

version: '3.5'

services:
      my-app:
        build:
            context: .
            dockerfile: Dockerfile
        ports:
            # http port
            - "9080:8080"

      filebeat:
        container_name: filebeat
        user: root
        image: filebeat:latest
        volumes:
            - /var/run/docker.sock:/var/run/docker.sock
            - /var/lib/docker/containers:/hostfs/var/lib/docker/containers
        command: filebeat -e -E output.elasticsearch.username=elastic -E output.elasticsearch.password=changeme -strict.perms=false


networks:
    default:
        external:
            name: my-network

And here are the DEBUG logs from Filebeat. They show that it is actually seeing the container and picking up its ID.

filebeat            | 2019-12-19T21:40:33.089Z  INFO    log/input.go:152        Configured paths: [/var/lib/docker/containers/986f0c5d737f075322c71e8b59c8ad939e345f5891fb2c659ad7a55fa77a479e/*-json.log]
filebeat            | 2019-12-19T21:40:33.089Z  DEBUG   [autodiscover]  cfgfile/list.go:101     Starting runner: input [type=container, ID=3747588001660246069]
filebeat            | 2019-12-19T21:40:33.089Z  INFO    input/input.go:114      Starting input of type: container; ID: 3747588001660246069
filebeat            | 2019-12-19T21:40:33.090Z  DEBUG   [input] log/input.go:191        Start next scan
filebeat            | 2019-12-19T21:40:33.090Z  DEBUG   [input] log/input.go:212        input states cleaned up. Before: 0, After: 0, Pending: 0

Any replies?

I should mention that my microservice application emits its logs straight to stdout. I can see the logs right in the console -- why is Filebeat not picking them up?

For debugging purposes, could you temporarily comment out the output.elasticsearch section in your filebeat.yml and set this instead?

output.console:
  enabled: true

Then restart Filebeat and check whether its container logs show events corresponding to logs from your microservice container.
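
In other words, a minimal filebeat.yml for this test could look like the following sketch (keeping your autodiscover block as-is):

filebeat.autodiscover:
  providers:
    - type: docker
      hints.enabled: true

output.console:
  enabled: true

logging.level: debug

If events print to the console, collection is working and the problem is on the Elasticsearch output side; if nothing prints, the problem is on the input side.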

I added the output.console but that didn't seem to make any difference.

I removed autodiscover from the config and changed it to use inputs instead:

filebeat.yml:

filebeat.inputs:
- type: container
  combine_partial: true
  paths: 
    - "/var/lib/docker/containers/*/*.log"
  containers.ids:
    - "*"

output.console:
  enabled: true

logging.level: debug
logging.to_files: true
logging.files:
  path: /var/log/filebeat
  name: filebeat.log
  keepfiles: 7

Same result from the DEBUG output, nothing is getting picked up at all.
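
As an aside, if I'm reading the 7.5 docs correctly, containers.ids is a leftover from the older docker input type and the container input only really needs paths, so a trimmed-down input would be:

filebeat.inputs:
- type: container
  paths:
    - "/var/lib/docker/containers/*/*.log"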

OK, I solved the issue: I had the wrong path specified under the volumes section in my docker-compose.yml. Filebeat's configured paths point at /var/lib/docker/containers, but I had mounted the host directory at /hostfs/var/lib/docker/containers, so that path didn't exist inside the Filebeat container. The fix:

        volumes:
            - /var/run/docker.sock:/var/run/docker.sock
            - /var/lib/docker/containers:/var/lib/docker/containers
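
(Alternatively, I suppose I could have kept the /hostfs mount and pointed the input paths at it instead -- untested, but something like:)

filebeat.inputs:
- type: container
  paths:
    - "/hostfs/var/lib/docker/containers/*/*.log"   # must match the container-side mount point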

Now I am getting a different error sending the harvested logs to Logstash, which I will investigate next.
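
My first guess is the output host: inside the Filebeat container, 127.0.0.1 refers to the Filebeat container itself, not to the ELK containers, so the outputs probably need to point at the compose service names on the shared network -- something like this (the service names are my assumption):

output.elasticsearch:
  hosts: ["elasticsearch:9200"]   # compose service name instead of 127.0.0.1

setup.kibana:
  host: "kibana:5601"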
