Fetching application logs through Filebeat in a Kubernetes environment

I want to collect logs of a Golang application through Filebeat and send them to Logstash. I have a Kubernetes cluster. I have the following questions regarding deployment:

  1. Can I run both the application and Filebeat through a docker-compose file so that, once deployed, Filebeat starts collecting the application logs? Any reference would also be helpful.

  2. How do I deploy Logstash in a Kubernetes cluster, and how do I get it to communicate with Filebeat? Any reference would be helpful.

Thanks

See if these links help
https://www.elastic.co/guide/en/beats/filebeat/current/running-on-kubernetes.html

I came across this article. It does not answer the first question, but it might answer the second one.
Anyway, thanks.

Hi,

To your second question: you can use Helm to install Logstash on your Kubernetes cluster.
This is the link to the Logstash Helm chart: https://github.com/helm/charts/tree/master/stable/logstash and this is the link to Helm, the Kubernetes package manager, for installation/usage details: https://helm.sh/

With this Helm command I install Logstash on my Kubernetes cluster:

    helm install stable/logstash --name logstash --namespace logging -f logstash_custom_values.yml --version 1.4.2

You also have to tell Filebeat (in filebeat.yml) where it can find Logstash:

  output:
    file:
      enabled: false
    logstash:
      hosts: ["logstash:5044"]
      index: "filebeat-app"
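
On the Logstash side, a minimal pipeline configuration that accepts Beats connections on port 5044 could look like the sketch below (the Elasticsearch host and index name here are assumptions; adjust them to your cluster):

```
# Sketch of a Logstash pipeline matching the Filebeat output above.
# The elasticsearch host is an assumption for illustration.
input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
  }
}
```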

Did you look at both links?

The first link should address the first point, I guess.
https://www.elastic.co/guide/en/beats/filebeat/current/running-on-kubernetes.html

Thanks for the answer. I will look into that. Meanwhile, can you tell me how to get the application logs into Filebeat if the application and Filebeat are running in different containers? This relates to my first question.

You can just use Filebeat autodiscover to tell Filebeat which container logs should be collected from your Kubernetes cluster:

filebeat.autodiscover:
    providers:
      - type: kubernetes
        templates:
          - condition:
              and:
                - equals:
                    kubernetes.namespace: dev
                - contains:
                    kubernetes.labels.app: redis
            config:
              - module: redis
                log:
                  enabled: true
                  input:
                    type: docker
                    fields:
                      category: app-redis-dev
                    fields_under_root: true
                    containers.ids:
                      - "${data.kubernetes.container.id}"
                slowlog:
                  enabled: false
          - condition:
              and:
                - equals:
                    kubernetes.namespace: logging
                - contains:
                    kubernetes.labels.app: logstash
            config:
              - module: logstash
                log:
                  enabled: true
                  input:
                    type: docker
                    fields:
                      category: app-logstash-dev
                    fields_under_root: true
                    containers.ids:
                      - "${data.kubernetes.container.id}"
                slowlog:
                  enabled: false

You can try using an EFS or NFS mount, which can be shared across containers.

Another option is Scribe, to aggregate logs in a single container.
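
As a sketch of the shared-mount idea in Kubernetes (all names, images, and paths below are illustrative assumptions), a pod can mount the same volume into both the application container and a Filebeat sidecar:

```yaml
# Sketch: the app writes logs to a shared volume and Filebeat
# reads them from the same mount. All names are assumptions.
apiVersion: v1
kind: Pod
metadata:
  name: app-with-filebeat
spec:
  containers:
    - name: app
      image: my-golang-app:latest          # hypothetical application image
      volumeMounts:
        - name: app-logs
          mountPath: /var/log/app          # app writes its log files here
    - name: filebeat
      image: docker.elastic.co/beats/filebeat:6.5.4
      volumeMounts:
        - name: app-logs
          mountPath: /var/log/app          # Filebeat tails the same files
          readOnly: true
  volumes:
    - name: app-logs
      emptyDir: {}                         # could be an NFS/EFS-backed volume instead
```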

Oh, I didn't know about this feature of Filebeat. Thanks for the answer.

Right now I have a Flask application running in a container. I can see the request logs by running the docker logs command. So I want Filebeat (running in a separate container) to pick up these logs, because the container logs are stored on the host OS only, right?
Is this the right approach when not in a Kubernetes environment? Can I use Docker Compose for this case?

Also, where on macOS are the log files of each container stored?

Can't use Scribe. I have to use the Elastic stack only.

Put your logs on EFS or NFS and mount the volume in both containers.
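
For the Docker Compose case, a minimal sketch of the shared-volume approach might look like this (the service names, images, and paths are assumptions for illustration):

```yaml
# Sketch: application and Filebeat share a named log volume in Compose.
# Service names, images, and mount paths are illustrative assumptions.
version: "3"
services:
  app:
    image: my-flask-app:latest            # hypothetical application image
    volumes:
      - app-logs:/var/log/app             # app writes its logs here
  filebeat:
    image: docker.elastic.co/beats/filebeat:6.5.4
    volumes:
      - app-logs:/var/log/app:ro          # Filebeat reads the same logs
      - ./filebeat.yml:/usr/share/filebeat/filebeat.yml:ro
volumes:
  app-logs:
```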

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.