Filebeat on Kubernetes not pulling logs from all pods

Hi,
I am running Filebeat as a DaemonSet on Kubernetes. It is pulling logs from a few pods, but not all of them. Here is the simple config used:

filebeat.autodiscover:
  providers:
    - type: kubernetes
      node: ${NODE_NAME}
      hints.enabled: true
      hints.default_config:
        type: container
        paths:
          - /var/lib/docker/containers/${data.kubernetes.container.id}/*.log

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml

filebeat.modules:
  - module: kafka

processors:
  - add_cloud_metadata:
  - add_host_metadata:

output.elasticsearch:
  host: '${NODE_NAME}'
  hosts: '${ELASTICSEARCH_HOSTS:elasticsearch-master:9200}'

setup.kibana:
  host: 'kibana-kibana.default.svc.cluster.local:5601'

setup.dashboards.enabled: true
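For context, the hint default paths above can only work if the DaemonSet mounts the host's Docker log directory into the Filebeat container. A minimal sketch of the relevant volume stanzas (the volume name is illustrative; the exact values in my manifest may differ):

```yaml
# Excerpt of a Filebeat DaemonSet pod spec (sketch, not the full manifest).
# The container mounts the host's Docker container-log directory read-only,
# so the paths in hints.default_config resolve inside the pod.
containers:
  - name: filebeat
    volumeMounts:
      - name: varlibdockercontainers        # illustrative name
        mountPath: /var/lib/docker/containers
        readOnly: true
volumes:
  - name: varlibdockercontainers
    hostPath:
      path: /var/lib/docker/containers
```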

Any help is appreciated.
Filebeat version: 7.7.0

When I looked at /var/lib/docker/containers/ inside the Filebeat pod (kubectl exec -it filebeat-filebeat-9phcf -- ls -l /var/lib/docker/containers), not every pod's logs were there. Is Filebeat mounting the directory as expected? What else could I check?
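One thing I considered: the kubelet also exposes container logs under /var/log/containers (symlinks into /var/log/pods), which is the path the Filebeat Kubernetes examples tend to use. Assuming that directory is mounted into the pod as well, an alternative default config might look like:

```yaml
# Sketch assuming /var/log/containers is host-mounted into the Filebeat pod.
# The *-<container id>.log pattern matches the kubelet's symlink naming.
hints.default_config:
  type: container
  paths:
    - /var/log/containers/*${data.kubernetes.container.id}.log
```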

Regards
