I want to collect the logs of a Go application through Filebeat and send them to Logstash. I have a Kubernetes cluster. I have the following questions regarding deployment:
Can I run both the application and Filebeat through a docker-compose file so that, once deployed, Filebeat starts picking up the application logs? Any reference would also be helpful.
How do I deploy Logstash in a Kubernetes cluster, and how do I get it to communicate with Filebeat? Any reference would be helpful.
To your second question: you can use Helm to install Logstash on your Kubernetes cluster.
Here is the Logstash Helm chart: https://github.com/helm/charts/tree/master/stable/logstash and here is the Helm package manager for Kubernetes, with installation/usage details: https://helm.sh/ .
With this Helm command I install Logstash on my Kubernetes cluster: helm install stable/logstash --name logstash --namespace logging -f logstash_custom_values.yml --version 1.4.2
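A minimal sketch of what logstash_custom_values.yml could contain, assuming the stable/logstash chart's inputs/outputs keys (key names vary by chart version, so check the chart's own values.yaml first). It opens a Beats input on port 5044 so Filebeat can connect:

```yaml
# logstash_custom_values.yml -- hypothetical sketch; verify key names against
# the chart's values.yaml for your chart version before using.
inputs:
  main: |-
    input {
      beats {
        port => 5044   # Filebeat will ship events to this port
      }
    }
outputs:
  main: |-
    output {
      stdout { codec => rubydebug }   # replace with your real output, e.g. elasticsearch
    }
```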
You also have to tell Filebeat (in filebeat.yml) where it can find Logstash.
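A minimal filebeat.yml sketch for that, assuming Logstash was installed as a Kubernetes service named logstash in the logging namespace and listens for Beats on port 5044 (adjust the host to however Logstash is reachable in your setup):

```yaml
# filebeat.yml -- minimal sketch; the host name and port are assumptions
# based on a k8s service "logstash" in namespace "logging" with a Beats
# input on 5044.
filebeat.inputs:
  - type: container            # reads container log files
    paths:
      - /var/log/containers/*.log

output.logstash:
  hosts: ["logstash.logging.svc:5044"]
```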
Thanks for the answer, I will look into that. Meanwhile, can you tell me how to get the application logs into Filebeat if the application and Filebeat are running in different containers? This is in the context of my first question.
Ohh, I didn't know about this feature of Filebeat. Thanks for the answer.
Right now I have a Flask application running in a container. I can see the request logs by running the docker logs command. So I want Filebeat (running in a separate container) to pick up these logs, because the container logs are stored on the host OS only, right?
Is this the right approach when not in a Kubernetes environment? Can I use Docker Compose for this case?
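A docker-compose sketch of that setup, assuming the Flask app logs to stdout and Docker's default json-file logging driver: Filebeat runs as a sidecar service and reads every container's log files through a read-only mount of the host's Docker log directory (paths assume Linux defaults; the image tag and service names are illustrative):

```yaml
# docker-compose.yml -- minimal sketch under the assumptions above.
version: "3"
services:
  app:
    build: .                  # your Flask application image
  filebeat:
    image: docker.elastic.co/beats/filebeat:7.17.0   # pick your own version
    user: root                # needed to read the docker log files
    volumes:
      - ./filebeat.yml:/usr/share/filebeat/filebeat.yml:ro
      - /var/lib/docker/containers:/var/lib/docker/containers:ro
      - /var/run/docker.sock:/var/run/docker.sock:ro
```

With this layout the application does not need to know about Filebeat at all; Filebeat tails the json log files Docker already writes for the app container.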
Also, where on macOS are the log files of each container stored?
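One way to find out (noting that on macOS, Docker Desktop runs containers inside a Linux VM, so the json log files live inside that VM rather than directly on the macOS filesystem) is to ask Docker itself where a container's log file is; the container name here is a placeholder:

```shell
# Prints the path of the container's json log file. On macOS this path is
# inside the Docker Desktop Linux VM, not on the host filesystem.
docker inspect --format '{{.LogPath}}' <container-name>
```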