I'm trying to get Filebeat data into Elastic. I've followed the instructions and have Filebeat running in a container. Ultimately I'd like it to grab stdout from my other containers and push that to Elastic Cloud (elastic.co), but first I just want to see something working.
Right now the container is running and printing Filebeat's own stack-monitoring output in the terminal, but I'm not seeing anything in Kibana.
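As a sanity check I can run Filebeat's built-in self-tests inside the container; assuming the compose service is named filebeat as below, these should confirm whether the config parses and whether the Elastic Cloud output is reachable:

docker-compose exec filebeat filebeat test config
docker-compose exec filebeat filebeat test output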
docker-compose.yml
filebeat:
  image: docker.elastic.co/beats/filebeat:7.4.1
  command:
    - "-e"
    - "--strict.perms=false"
  networks:
    - kong-net
  volumes:
    - "./services/filebeat/filebeat.yml:/usr/share/filebeat/filebeat.yml:ro"
    # needed to persist filebeat tracking data:
    - "filebeat_data:/usr/share/filebeat/data:rw"
    # needed to access all docker logs (read only):
    - "/var/lib/docker/devicemapper/devicemapper/data:/usr/share/dockerlogs/data:ro"
    # needed to access additional information about containers:
    - "/var/run/docker.sock:/var/run/docker.sock"
filebeat.yml
filebeat.inputs:
- type: docker
  combine_partial: true
  containers:
    path: "/usr/share/dockerlogs/data"
    stream: "stdout"
    ids:
      - "*"
  exclude_files: ['\.gz$']
  ignore_older: 10m
  processors:
    # decode the log field (a sub JSON document) if it is JSON encoded, then map its fields to Elasticsearch fields
    - decode_json_fields:
        fields: ["log", "message"]
        target: ""
        # overwrite existing target Elasticsearch fields while decoding JSON fields
        overwrite_keys: true
    - add_docker_metadata:
        host: "unix:///var/run/docker.sock"

# to send output to Logstash instead:
# output.logstash:
#   hosts: ["logstash"]

cloud.id: "my_testt:secret_key"
cloud.auth: "elastic:secret"

processors:
  # decode the log field (a sub JSON document) if it is JSON encoded, then map its fields to Elasticsearch fields
  - decode_json_fields:
      fields: ["log"]
      target: ""
      # overwrite existing target Elasticsearch fields while decoding JSON fields
      overwrite_keys: true
  - add_docker_metadata: ~
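If nothing shows up even then, my fallback plan is to strip filebeat.yml down to the bare minimum and prove the pipeline end to end: a plain container input reading the mounted JSON logs (this sketch assumes the /var/lib/docker/containers mount suggested above, with cloud values redacted as before):

filebeat.inputs:
- type: container
  paths:
    - /usr/share/dockerlogs/data/*/*.log

cloud.id: "my_testt:secret_key"
cloud.auth: "elastic:secret"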