Log Path for ECK Agent Daemonset

I deployed the Kubernetes integration to an ECK daemonset deployment (version 8.2.2) in Amazon EKS running on Bottlerocket hosts. I'm not getting any container or audit logs. When I exec into one of the agent pods, there is no `/var/log/kubernetes` or `/var/log/containers` path. Is this expected, or do I need to update the paths to something else?


By default, the Kubernetes integration collects container logs from the `/var/log/containers/*${kubernetes.container.id}.log` path and audit logs from `/var/log/kubernetes/kube-apiserver-audit.log`. If your logs are written to a different path, you should adjust the integration settings accordingly.
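For reference, this is roughly what the container-log path looks like in a standalone agent policy. A hedged sketch only: the `id` and stream layout here are illustrative, so compare against the policy your integration actually generates before copying anything:

```yaml
# Illustrative fragment of a standalone Elastic Agent policy.
# The glob below is the integration's default container-log path.
inputs:
  - id: container-log          # illustrative id, not a required value
    type: filestream
    streams:
      - paths:
          - /var/log/containers/*${kubernetes.container.id}.log
```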

You can read more about it here: Kubernetes | Elastic Documentation

Correct; however, it appears that the problem is that the default deployment of the agent via the operator does not include the volume mounts/volumes needed to read those paths. I'm still validating this, but it appears that the following needs to be added manually for any ECK-managed agent that will be configured to use the Kubernetes integration package:

```yaml
spec:
  daemonSet:
    podTemplate:
      spec:
        containers:
          - name: agent
            volumeMounts:
              - mountPath: /var/lib/docker/containers
                name: varlibdockercontainers
              - mountPath: /var/log/containers
                name: varlogcontainers
              - mountPath: /var/log/pods
                name: varlogpods
        volumes:
          - name: varlibdockercontainers
            hostPath:
              path: /var/lib/docker/containers
          - name: varlogcontainers
            hostPath:
              path: /var/log/containers
          - name: varlogpods
            hostPath:
              path: /var/log/pods
```