Hi, I've been trying to set up Kubernetes logging with Filebeat running outside our Kubernetes cluster. Here is my config:
filebeat:
  autodiscover:
    providers:
      - type: kubernetes
        kube_config: /etc/kubernetes/kubelet-kubeconfig.yml
        in_cluster: false
        templates:
          - condition:
              regexp:
                kubernetes.namespace: ".*"
            config:
              - type: docker
                include_annotations: true
                containers.ids:
                  - "${data.kubernetes.container.id}"
logging:
  files:
    keepfiles: 7
    name: filebeat.log
    path: /var/log/filebeat
    permissions: '0644'
    rotateeverybytes: 104857600
  level: debug
  to_files: true
output:
  file:
    path: /tmp/filebeat
Filebeat doesn't even seem to create the output file.
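In case the output section itself is part of the problem, I'm also going to try setting the filename explicitly (as far as I understand it should already default to the Beat name, so this is only to rule it out):
output:
  file:
    path: /tmp/filebeat
    filename: filebeat   # explicit filename; should match the default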
Debug logs are here (posted externally because of the body size limit): https://pastebin.com/ry0bvA6f
From the logs it looks like it does find at least some pods, but only in the kube-system namespace, even though I have pods running in other namespaces too. No output is produced for those either.
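To check whether the catch-all regexp condition is part of the problem, I'm also planning to test a template with a plain equals condition against kube-system, since that's the only namespace showing up in the logs. This is just the templates section; the rest of the provider config stays the same:
templates:
  - condition:
      equals:
        kubernetes.namespace: "kube-system"   # single namespace, just for testing
    config:
      - type: docker
        containers.ids:
          - "${data.kubernetes.container.id}"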
I had previously tried it without autodiscover:
filebeat:
  prospectors:
  - type: log
    paths:
    - /var/lib/docker/containers/*/*.log
    fields:
      kubeenv: dev1
      type: kubelog
    fields_under_root: true
    json:
      keys_under_root: true
      message_key: log
    processors:
    - add_kubernetes_metadata:
        in_cluster: false
        kube_config: /etc/kubernetes/kubelet-kubeconfig.yml
That approach does seem to work, but it runs into this issue: https://github.com/elastic/beats/issues/5377
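To take both autodiscover and add_kubernetes_metadata out of the picture, I'm also thinking of testing a bare docker prospector pointed at one known container ID (the ID below is just a placeholder for a real one from this node):
filebeat:
  prospectors:
  - type: docker
    containers.ids:
    - "0123456789ab"   # placeholder: substitute a real container ID from /var/lib/docker/containers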
Any ideas what could be wrong?