Hi,
We are setting up logging for services running on Kubernetes. These services generate log files under APP_DIR inside their containers. How can I ship these logs to Elasticsearch?
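For context, each application pod looks roughly like the sketch below. The image name and log path are the ones referenced in the Filebeat config further down; the Deployment name, labels, and replica count are simplified placeholders, and I'm assuming APP_DIR is /var/www/my_app:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app
spec:
  replicas: 2
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
        - name: my-app
          image: my_app   # same image name the autodiscover condition matches on
          # The service writes its log file to a path inside the container
          # filesystem, e.g. /var/www/my_app/app.log (under APP_DIR).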
I've referred to:
https://www.elastic.co/guide/en/beats/filebeat/current/configuration-autodiscover.html
I've deployed Filebeat as a DaemonSet (a trimmed sketch of the DaemonSet spec is included after the ConfigMap below). My Filebeat ConfigMap:
apiVersion: v1
kind: ConfigMap
metadata:
  name: filebeat-config
  namespace: kube-system
  labels:
    k8s-app: filebeat
data:
  filebeat.yml: |-
    setup.dashboards.enabled: true
    setup.template.enabled: true
    setup.template.settings:
      index.number_of_shards: 1

    filebeat.autodiscover:
      providers:
        - type: kubernetes
          templates:
            # Apply this input only to containers running the my_app image
            - condition.contains:
                kubernetes.container.image: my_app
              config:
                # 'log' input reading the file path inside the container filesystem
                - type: log
                  paths:
                    - "/var/www/my_app/app.log"

    processors:
      - add_cloud_metadata:

    cloud.id: ${ELASTIC_CLOUD_ID}
    cloud.auth: ${ELASTIC_CLOUD_AUTH}

    output.elasticsearch:
      hosts: ['${ELASTICSEARCH_HOST:elasticsearch}:${ELASTICSEARCH_PORT:9200}']
      username: ${ELASTICSEARCH_USERNAME}
      password: ${ELASTICSEARCH_PASSWORD}
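The DaemonSet itself is based on the reference filebeat-kubernetes manifest from the Elastic docs. A trimmed sketch of what I'm running; the image tag, Elasticsearch env values, and host data path are placeholders:

apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: filebeat
  namespace: kube-system
  labels:
    k8s-app: filebeat
spec:
  selector:
    matchLabels:
      k8s-app: filebeat
  template:
    metadata:
      labels:
        k8s-app: filebeat
    spec:
      serviceAccountName: filebeat
      terminationGracePeriodSeconds: 30
      containers:
        - name: filebeat
          image: docker.elastic.co/beats/filebeat:7.9.3   # placeholder tag
          args: ["-c", "/etc/filebeat.yml", "-e"]
          env:
            - name: ELASTICSEARCH_HOST
              value: elasticsearch
            - name: ELASTICSEARCH_PORT
              value: "9200"
            # plus ELASTICSEARCH_USERNAME / ELASTICSEARCH_PASSWORD /
            # ELASTIC_CLOUD_ID / ELASTIC_CLOUD_AUTH, injected from a Secret
          securityContext:
            runAsUser: 0
          volumeMounts:
            - name: config
              mountPath: /etc/filebeat.yml
              subPath: filebeat.yml
              readOnly: true
            - name: varlibdockercontainers
              mountPath: /var/lib/docker/containers
              readOnly: true
            - name: data
              mountPath: /usr/share/filebeat/data
      volumes:
        - name: config
          configMap:
            defaultMode: 0600
            name: filebeat-config
        - name: varlibdockercontainers
          hostPath:
            path: /var/lib/docker/containers
        - name: data
          hostPath:
            path: /var/lib/filebeat-data
            type: DirectoryOrCreate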
Is it possible to read files inside the application container's filesystem using a Filebeat DaemonSet deployment? Is the above approach correct?
Some direction would be helpful.
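One option I'm considering, in case a node-level DaemonSet cannot see files that exist only inside the application container, is to expose the log directory on the node via a hostPath volume. A rough sketch of that variant; the host path is just a placeholder:

# Hypothetical variant of the my_app pod: the log directory is also exposed
# on the node via hostPath, so the node-level Filebeat can read app.log.
apiVersion: v1
kind: Pod
metadata:
  name: my-app
  labels:
    app: my-app
spec:
  containers:
    - name: my-app
      image: my_app
      volumeMounts:
        - name: app-logs
          mountPath: /var/www/my_app      # directory where app.log is written
          # note: this mounts over /var/www/my_app, so it only makes sense
          # if that directory holds nothing but logs
  volumes:
    - name: app-logs
      hostPath:
        path: /var/log/my_app             # placeholder path on the node
        type: DirectoryOrCreate

The Filebeat DaemonSet would then mount the same host path, and the autodiscover paths above would point at it instead of the in-container path. Is that a recommended pattern, or is there a better-supported way?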