Docker GELF and Logstash to Logstash

Hello guys
I'm using the Docker gelf logging driver with the gelf input plugin and Elasticsearch as output, and this works fine.

What I want is this:
docker -> gelf_input -> lumberjack_out -> lumberjack_in -> es

The problem is that all the goodies like container name, image ID and so on are lost in the Logstash-to-Logstash hop.

This is the proxy sending the logs to the other Logstash machine (here I can access the container_name field, for example):

input {
  gelf {
    type => "docker"
    port => 12201
  }
}
output {
  lumberjack {
    hosts => ["someremotehost.com"]
    port => 5000
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
  }
}

The receiving machine has this (here I only get the message field; I could merge and split, but maybe there is a better solution):

input {
  lumberjack {
    port => 5000
    type => "docker"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

Any ideas about how to persist the Docker fields?

Thanks

What do the original events look like? What do they look like after the lumberjack back and forth? I suspect you'll want to use the json codec for both lumberjack plugins.

Original
{ "_index": "logstash-2016.06.14", "_type": "docker", "_id": "AVVOP87XaNgHqqvkRf_V", "_score": null, "_source": { "version": "1.1", "host": "hamakabi", "level": 6, "@version": "1", "@timestamp": "2016-06-14T09:30:51.854Z", "source_host": "172.17.0.1", "message": "2016-06-14 09:30:50 UTC LOG: incomplete startup packet", "command": "/bin/sh -c rm -f /var/log/postgresql/postgresql-9.3-main.log \t&& service postgresql start \t&& tail -f /var/log/postgresql/postgresql-9.3-main.log", "container_id": "2397fda5076279766a84950b0760844b5470259cae", "container_name": "dbgelf", "created": "2016-06-14T09:30:47.481540705Z", "image_id": "sha256:b4dc0e3d48524c6908170394e", "image_name": "pgsql:dev", "tag": "", "type": "docker" }, "fields": { "created": [ 1465896647481 ], "@timestamp": [ 1465896651854 ] }, "sort": [ 1465896651854 ] }

After the lumberjack hop, as you can see, all the Docker information is gone (I removed some sensitive fields), but I think you get the idea:
{ "_index": "logstash-2016.06.20", "_id": "AVVtS1kxrHGcnCLe_BTu", "_score": null, "_source": { "message": "a message", "@version": "1", "@timestamp": "2016-06-20T10:11:42.389Z", "host": "staging", "offset": "158221" }, "fields": { "@timestamp": [ 1466417502389 ] }, "sort": [ 1466417502389 ] }

So your idea is to pack the message in JSON and then unpack it? Can this be done in a quick way? Right now I just merge the Docker fields into the message and then do a split on the other side (roughly as sketched below). I think JSON would be more elegant.
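For context, the merge/split hack looks roughly like this; the delimiter, field list and pattern are just illustrative, not my exact filters.

On the sending side, before the lumberjack output:

filter {
  # Join a few Docker fields into the message so they survive the plain lumberjack hop.
  mutate {
    replace => { "message" => "%{container_name}|%{image_name}|%{message}" }
  }
}

On the receiving side:

filter {
  # Split them back out and restore the original message text.
  grok {
    match => { "message" => "^%{DATA:container_name}\|%{DATA:image_name}\|%{GREEDYDATA:msg_rest}" }
  }
  mutate {
    replace => { "message" => "%{msg_rest}" }
    remove_field => [ "msg_rest" ]
  }
}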

Add codec => json to the lumberjack input and output plugins.
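Applied to your configs above, that would look something like this (only the codec lines are new):

output {
  lumberjack {
    hosts => ["someremotehost.com"]
    port => 5000
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    # Serialize the whole event as JSON, not just the message field.
    codec => json
  }
}

input {
  lumberjack {
    port => 5000
    type => "docker"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
    # Deserialize the JSON back into the original event fields.
    codec => json
  }
}

With the json codec on both ends, the full event (container_name, image_id and the rest) is serialized over the wire and restored on the receiving side, so no merge/split filters are needed.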


This totally did the trick!
Thank you very much.