Hi
I am trying to use the ELK stack to view and index logs from a bunch of services running in Docker containers. One such service is NGINX, and I would like to start with that.
I have deployed Filebeat, Logstash, Elasticsearch & Kibana with Docker and docker-compose.
I am successfully harvesting all Docker logs with Filebeat (which also adds the Docker metadata) and forwarding them to Logstash (which currently does not do much) and on to Elasticsearch.
So in Kibana I can see something like this:
What I would like to do is parse and index the log messages from the NGINX container, which can be identified by its container name.
My first thought is to use Logstash to extract the NGINX message from the Docker JSON log (an example raw line is shown after the questions below) and then parse it, roughly along the lines of the filter sketch at the end of this post.
- Is this possible?
- Is this the right approach, or should I be doing everything in Filebeat (something like the autodiscover sketch after my Filebeat config below), or something else entirely?
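For context, the raw lines that Filebeat picks up under /var/lib/docker/containers/ are in Docker's json-file format, with the NGINX access line wrapped inside the log field. The values below are made up, just to illustrate the shape:

{"log":"172.17.0.1 - - [01/Jan/2024:00:00:00 +0000] \"GET / HTTP/1.1\" 200 612 \"-\" \"curl/7.68.0\"\n","stream":"stdout","time":"2024-01-01T00:00:00.000000000Z"}

My understanding is that Filebeat's container input already strips this wrapper, so what reaches Logstash should be the inner NGINX line in the message field, plus the fields added by add_docker_metadata.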
My Filebeat configuration looks like this:
filebeat.inputs:
  - type: container
    paths:
      - '/var/lib/docker/containers/*/*.log'

processors:
  - add_docker_metadata:
      host: "unix:///var/run/docker.sock"

output.logstash:
  hosts: ["logstash:5044"]
And the Logstash config is currently not very interesting:
input {
  beats {
    port => 5044
    host => "0.0.0.0"
  }
}

## Add your filters / logstash plugins configuration here

output {
  elasticsearch {
    hosts => [ "elasticsearch:9200" ]
  }
}
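What I am imagining is adding a filter block roughly like this, conditional on the container name and using the stock combined-log grok pattern. It is untested, and the exact field name for the container ([container][name] vs. [docker][container][name]) is a guess that probably depends on the Filebeat version:

filter {
  # Only touch events coming from the NGINX container
  if [container][name] =~ /nginx/ {
    grok {
      # Access log lines in the default combined format;
      # error log lines would need a different pattern
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    date {
      # Use the request timestamp from the access log as the event time
      match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
  }
}

Does that look like the right direction, or is there a more idiomatic way to do this?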