Filebeat not reading logs 🤨

Hi, I have a Kubernetes cluster where, inside the pods, I need to read the logs of one application (Airflow) stored in the following structure: /usr/local/airflow/logs/{jobname}/{jobname}/{timestamp}/1.log (example: /usr/local/airflow/logs/somejob/somejob/2022-03-31T17:21:44.563123+00:00/1.log).
This seems easy: I just need to use Kubernetes autodiscover with the path template /usr/local/airflow/logs/**/*.log. But in the Filebeat logs I can see that no harvester is started for the files located there. My filebeat.yml:

filebeat.autodiscover:
  providers:
    - type: kubernetes
      templates:
        - condition:
            equals.kubernetes.container.name: airflow-web
          config:
            - type: container
              paths:
                - /usr/local/airflow/logs/**/*.log
              tail_files: true
output.logstash:
  hosts: ["logstash:22004"]

The funny thing is that Filebeat can't read files like /usr/local/airflow/logs/somejob/somejob/2022-03-31T17:21:44.563123+00:00/1.log, but when I try to read the container logs from /var/log/containers/*-${data.kubernetes.container.id}.log I can see them in Kibana, so the problem is not with the Logstash output. But I really need the logs from /usr/local/airflow/logs/somejob/somejob/2022-03-31T17:21:44.563123+00:00/1.log.
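
For reference, the variant that does work looks roughly like this (a sketch only; the paths entry is the one mentioned above, and the rest of the template is assumed to be identical to the config I posted):

filebeat.autodiscover:
  providers:
    - type: kubernetes
      templates:
        - condition:
            equals.kubernetes.container.name: airflow-web
          config:
            - type: container
              paths:
                - /var/log/containers/*-${data.kubernetes.container.id}.log
output.logstash:
  hosts: ["logstash:22004"]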

Maybe I missed something :face_with_monocle: Thanks in advance :wink: P.S. The container name is airflow-web (that's for sure).

Could you please share the debug logs of Filebeat?
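
In case it helps, debug logging can be enabled in filebeat.yml roughly like this (the selector names below are only my guess at the relevant loggers; running Filebeat with -e -d "*" should also do it):

logging.level: debug
logging.selectors: ["autodiscover", "input", "harvester"]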
