Add_docker_metadata failing to get container id if logs are in a different path

Setup:

  • docker logs are not in the default folder "/var/lib/docker/containers/*/*-json.log" but in "/home/var/docker/data/containers/*/*-json.log"
  • Filebeat works just fine and pushes docker logs into elasticsearch

Problem:

  • "docker.container.id" always says "containers" instead of the actual ids
  • This leads to "add_docker_metadata" errors, since the container "containers" doesn't exist

It's as if, when extracting the container id, it takes the 5th item in the path; but since in my case the id is the 6th item, it just gets "containers" (the actual 5th item in my path).

Is this hardcoded somewhere or something? Any idea how to solve this?
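For illustration, here's a tiny standalone sketch of what I suspect is going on (my own simplification, not the actual Beats code; the container id "1234abcd" and the index value are just made-up examples):

package main

import (
	"fmt"
	"strings"
)

// Suspicion: the processor picks a fixed slash-separated segment of the log
// path as the container id, so an extra directory level shifts everything.
func main() {
	const fixedIndex = 5 // hypothetical fixed position of the container id

	defaultPath := "/var/lib/docker/containers/1234abcd/1234abcd-json.log"
	customPath := "/home/var/docker/data/containers/1234abcd/1234abcd-json.log"

	fmt.Println(strings.Split(defaultPath, "/")[fixedIndex]) // "1234abcd" -> the id
	fmt.Println(strings.Split(customPath, "/")[fixedIndex])  // "containers" -> wrong
}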

As a note, I set the path under the containers section of the docker input, which gives me this in filebeat.yml:

- type: docker
  enabled: true
  containers:
    ids:
      - "*"
    path: "/home/var/docker/data/containers"
  scan_frequency: 10s
  processors:
  - add_docker_metadata:
      host: "unix:///var/run/docker.sock"

And the issue is the same with the log input:

- type: log
  enabled: true
  paths:
    - /home/var/docker/data/containers/*/*-json.log
  scan_frequency: 10s
  processors:
  - add_docker_metadata:
      host: "unix:///var/run/docker.sock"

"docker.container.id" is always "containers"
and in the logs for filebeat, there is this all the time

2018-08-08T10:38:29.730+0200 DEBUG [add_docker_metadata] add_docker_metadata/add_docker_metadata.go:169 Container not found: cid=containers

Ok, I found the problem

It's HARDCODED that the id should be in the 5th position of the path.

see the source on GitHub

Alright, my bad. Checking the source more closely, you can actually configure this: the option is not sourceindex but match_source_index, just like you can change the host.

- type: log
  enabled: true
  paths:
    - /home/var/docker/data/containers/*/*-json.log
  scan_frequency: 10s
  processors:
  - add_docker_metadata:
      match_source_index: 5
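
In case it helps anyone later: the way the value 5 lines up with my path seems to be zero-based segments with the leading slash not counted. A quick sketch of that counting (my assumption from what worked here, not taken from the Beats docs; "1234abcd" is a made-up container id):

package main

import (
	"fmt"
	"strings"
)

// Print each slash-separated segment of my log path with its index,
// assuming zero-based counting and no leading empty segment.
func main() {
	logPath := "/home/var/docker/data/containers/1234abcd/1234abcd-json.log"
	for i, seg := range strings.Split(strings.TrimPrefix(logPath, "/"), "/") {
		fmt.Printf("%d: %s\n", i, seg)
	}
	// 0: home  1: var  2: docker  3: data  4: containers  5: 1234abcd
	// -> match_source_index: 5 points at the container id directory
}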

close this thread whenever


Glad you found a solution and thanks for sharing it here.
