Parsing logs from Dockerized NGINX


I am trying to use the ELK stack to view & index logs from a bunch of services running in docker containers. One such service is NGINX, and I would like to start with it.

I have deployed filebeat, logstash, elasticsearch & kibana with docker & docker-compose.
Filebeat is successfully harvesting all docker logs (and adding docker metadata) and forwarding them to logstash (which is currently not really doing anything) and on to elasticsearch.

So in Kibana I can see something like this:

What I would like to do is parse and index log messages from the nginx container, which can be identified by name.

My first thought is that I should be using logstash to extract the nginx message from the docker json log and then parse this.

  • Is this possible?
  • Is this correct, or should I be using only filebeat or doing something else?
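What I have in mind is branching in logstash on the container name that add_docker_metadata attaches. A rough sketch (I am assuming filebeat publishes the name under [container][name]; older versions used [docker][container][name] instead):

```conf
filter {
  # Only touch events that came from the nginx container
  if [container][name] == "nginx" {
    # Tag them for now; the real parsing would go here
    mutate { add_tag => ["nginx"] }
  }
}
```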

My filebeat configuration looks like this:

filebeat.inputs:
- type: container
  paths:
    - '/var/lib/docker/containers/*/*.log'

processors:
  - add_docker_metadata:
      host: "unix:///var/run/docker.sock"

output.logstash:
  hosts: ["logstash:5044"]

And logstash config is currently not very interesting:

input {
	beats {
		port => 5044
		host => ""
	}
}

## Add your filters / logstash plugins configuration here

output {
	elasticsearch {
		hosts => [ "elasticsearch:9200" ]
	}
}

I added a grok filter to the logstash.conf:

grok {
	match => { "message" => "%{IPORHOST:remote_ip} - %{DATA:user_name} \[%{HTTPDATE:access_time}\] \"%{WORD:http_method} %{DATA:url} HTTP/%{NUMBER:http_version}\" %{NUMBER:response_code} %{NUMBER:body_sent_bytes} \"%{DATA:referrer}\" \"%{DATA:agent}\"" }
}

It catches access messages, but [notice] messages from nginx itself are in a different format.
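Since nginx's own error/notice lines use a different layout (roughly `YYYY/MM/DD HH:MM:SS [level] pid#tid: message`), one option I am considering is giving grok a list of patterns, so it tries the access-log pattern first and falls back to an error-log pattern. A sketch (the error-log field names like log_level and error_message are my own guesses, not anything standard):

```conf
grok {
  match => { "message" => [
    "%{IPORHOST:remote_ip} - %{DATA:user_name} \[%{HTTPDATE:access_time}\] \"%{WORD:http_method} %{DATA:url} HTTP/%{NUMBER:http_version}\" %{NUMBER:response_code} %{NUMBER:body_sent_bytes} \"%{DATA:referrer}\" \"%{DATA:agent}\"",
    "%{YEAR}/%{MONTHNUM}/%{MONTHDAY} %{TIME} \[%{LOGLEVEL:log_level}\] %{POSINT:pid}#%{NUMBER:tid}: %{GREEDYDATA:error_message}"
  ] }
}
```

With an array, grok stops at the first pattern that matches, so access lines and [notice] lines would each get their own fields.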
