How to parse docker logs (nested serialized JSON)


(Chris Hunter) #1

Hello,
I have an AWS Beanstalk environment with docker and I'm sending different logs to logstash, however I'm finding problems parsing the /var/lib/docker/containers/<container_id>/<container_id>-json.log which has the following format:

{"log":"{\"@timestamp\":\"2017-05-16T12:37:22.001+00:00\",\"@version\":1,\"message\":\"Noise noise\",\"logger_name\":\"it.geenee.saf.service.NoiseService\",\"thread_name\":\"pool-2-thread-1\",\"level\":\"INFO\",\"level_value\":20000,\"HOSTNAME\":\"e9d0d4b96320\"}\n","stream":"stdout","time":"2017-05-16T12:37:22.001374657Z"}

With the current setup, Logstash only interprets the fields "log", "stream", and "time", which is correct as far as it goes. However, I'd like Logstash to also parse the JSON data nested inside the "log" field. I've tried the json filter without luck. This is my current setup (without the json filter):

(AWS ES 2.3, logstash 2.3)

logstash.conf:
input {
  file {
    type => "app"
    path => "/var/lib/docker/containers/*/*.log"
    codec => json {
      charset => "UTF-8"
    }
  }
  file {
    type => "system"
    path => "/var/log/*.log"
    start_position => "beginning"
    exclude => ["cfn-hup.log", "cfn-wire.log", "boot.log", "yum.log", "eb-activity.log"]
  }
  file {
    type => "eb-activity"
    path => "/var/log/eb-activity.log"
    start_position => "beginning"
  }
  file {
    type => "nginx"
    path => "/var/log/nginx/*.log"
  }
}

filter {
  mutate {
    add_tag => [ "saf" ]
  }
  mutate {
    add_field => { "service" => "saf" }
  }
  if [type] == "nginx" and [path] =~ "access" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    date {
      match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
  }
}

output {
  amazon_es {
    hosts => [ "logs.us-east-1.es.amazonaws.com" ]
    region => "us-east-1"
  }
}

What is the proper way to do this?

Thanks!


(Chris Hunter) #2

I finally fixed the issue by adding a json filter with source => "log"; that did the trick.
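
For anyone finding this later, a minimal sketch of that filter (field name taken from the log sample above; the json filter's source option tells it which field holds the serialized JSON string to parse):

filter {
  json {
    source => "log"
  }
}

With this in place, the keys inside the "log" string (@timestamp, message, logger_name, level, etc.) become top-level event fields. If you'd rather keep them under their own prefix instead of merging them into the event root, the filter also supports a target option, e.g. target => "app".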


(system) #3

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.