Logstash parse docker container log

Hi,

Our API runs in a container, and the API's log inside the container has a format like this:

{"log":"{"fields.time":"2019-02-16T02:53:00.606890428Z","ip":"xx.xxx.xxx.xxx","latency":303805427,"level":"info","method":"POST","msg":"","path":"/v1/contacts","status":200,"time":"2019-02-16T02:53:00Z","user-agent":"okhttp/3.11.0"}\n","stream":"stderr","time":"2019-02-16T02:53:00.607869346Z"}

I use Filebeat to send these logs to our Logstash, which should parse them into separate fields like logtime (= fields.time), ip, latency, status, ... (all of them nested in the log object).

Here is my Filebeat config:

- type: log
  paths:
    # Docker
    - /var/lib/docker/containers/*/*-json.log
  json.message_key: log
  json.keys_under_root: true
  fields:
    log_type: "docker"
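
As I understand it, with json.keys_under_root: true Filebeat decodes the outer Docker wrapper, so the event reaching Logstash should look roughly like this (a sketch; the application's own JSON is still a plain string inside the log field):

{
  "log": "{\"fields.time\":\"2019-02-16T02:53:00.606890428Z\",\"ip\":\"xx.xxx.xxx.xxx\", ...}",
  "stream": "stderr",
  "time": "2019-02-16T02:53:00.607869346Z",
  "fields": { "log_type": "docker" }
}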

Here is my Logstash config:

input {
  beats {
    port => 5044
  }
}

filter {
  if [fields][log_type] == "docker" {
    grok {
      match => { "message" => "%{GREEDYDATA}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }

    mutate {
      add_field => { "logtime" => "%{[log][fields.time]}" }
      add_field => { "logtime" => "%{[log][ip]}" }
    }
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["es_host:9200"]
    manage_template => false
    index => "test-%{+YYYY.MM.dd}"
  }
}

But it did not work as I expected.
In Kibana I want to see separate fields: logtime, ip, status, ...

The grok filter does not make any sense. It matches message against %{GREEDYDATA}, which will always match but does not extract any fields.

If the input is JSON then you should use a json filter to parse it.

json { source => "message" }
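
Note that with json.keys_under_root: true in the Filebeat config above, the application's JSON string may end up in the log field rather than message. A minimal sketch assuming that layout (adjust source to wherever the inner JSON actually lands in your events):

filter {
  if [fields][log_type] == "docker" {
    # Parse the inner JSON string and put the result back under [log]
    json {
      source => "log"
      target => "log"
    }
  }
}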

add_field => {"logtime" => "%{[log][fields.time]}"}
add_field => {"logtime" => "%{[log][ip]}"}

This will cause logtime to be an array with those two entries. Probably not what you want.
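
If you actually want two separate fields, give each one a distinct name, for example (client_ip is just an illustrative name, and this assumes a json filter has already parsed the string into a [log] object):

mutate {
  add_field => {
    "logtime"   => "%{[log][fields.time]}"
    "client_ip" => "%{[log][ip]}"
  }
}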
