Elastic Agent on Kubernetes: container logs not shipping from some nodes

I'm quite new to the Elastic Stack and I have a problem with Elastic Agents on Kubernetes (k8s).

Setup:
- Elastic Stack v8.7, deployed in a local environment, outside of any cluster.
- Kubernetes v1.25 with CRI-O; 1 master and 3 worker nodes per cluster.
- 3 Kubernetes clusters with identical configuration; all of them behave the same, regardless of subnet (one cluster is on the same subnet as Elastic, the others are on different subnets).
- kube-state-metrics is deployed on every cluster, per the prerequisites of the Kubernetes integration.

When deploying agents on Kubernetes with the DaemonSet .yaml provided by Elastic, the agents roll out fine and report a Healthy status. The policy is configured with the System and Kubernetes integrations.
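For reference, the log-related parts of that DaemonSet look roughly like this (paraphrased from the Elastic reference manifest; names and paths may differ slightly in the exact 8.7 version):

```yaml
# Excerpt (paraphrased) from the Elastic Agent DaemonSet I applied.
# The host's /var/log is mounted read-only so the agent can reach
# /var/log/pods and /var/log/containers (symlinks on CRI-O).
containers:
  - name: elastic-agent
    volumeMounts:
      - name: varlog
        mountPath: /var/log
        readOnly: true
volumes:
  - name: varlog
    hostPath:
      path: /var/log
```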

In the Infrastructure view all containers are populated correctly and metrics seem to come in just fine. But when I try to view container logs, I only get logs from the master and node-01, regardless of cluster.

System logs are sent just fine.

The agent pods have the /var/log/ path mounted correctly and have access to /var/log/containers/ and the logs in it. I can cat them just fine from inside the pod, so it doesn't look like a permissions issue.

In the elastic-agent diagnostics report, under \components\filestream-default, the reports from the node-02 and node-03 agents contain outputs for pods from node-01. In the logs I only see:

"Node k8-01 discovered by machine-id matching"

and no corresponding entry for node-02 or node-03.

I've tried redeploying the agents and deploying standalone agents, to no avail.
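For the standalone attempt, the container-log input in the policy I used looks roughly like this (adapted from the Elastic standalone Kubernetes manifest; option names are from memory and may not match 8.7 exactly):

```yaml
# Standalone Elastic Agent policy excerpt (paraphrased) for container log collection.
inputs:
  - id: filestream-container-logs
    type: filestream
    use_output: default
    streams:
      - data_stream:
          dataset: kubernetes.container_logs
        paths:
          - /var/log/containers/*${kubernetes.container.id}.log
        # /var/log/containers/* entries are symlinks on CRI-O, so follow them
        prospector.scanner.symlinks: true
        parsers:
          - container:
              stream: all
              format: auto
```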

Any suggestions on where to look next?
