How can I expose new fields sourced from the log record?

Hi all,

I'm using a structured logging package (GitHub - sirupsen/logrus: Structured, pluggable logging for Go.) and shipping the logs with Filebeat directly to our ES cluster. Unfortunately, I'm losing the ability to query the fields I'm logging, since the entire log message gets wrapped in Filebeat's exported "message" field. Can someone point me in the right direction for exporting these fields as top-level fields?

For example, I'm logging:

log.WithFields(log.Fields{
    "event": "click",
    "topic": "video",
    "key":   "123",
}).Fatal("Failed to send event")

But once this is propagated to Kibana, the log record is wrapped up in a 'message' field containing the entire logrus payload:

{
  "source": "/var/log/app.log",
  "message": "{\"event\":\"click\",\"topic\":\"video\",\"key\":\"123\",\"level\":\"fatal\",\"msg\":\"Failed to send event\",\"time\":\"2015-08-12T18:47:07Z\"}"
}

Is it possible to push fields such as 'key', 'event', and 'topic' to the top level so I can query them?

It is possible to parse JSON messages in Filebeat 5.x, but not in Filebeat 1.x. The json options are specified in the Filebeat configuration file.

If you are limited to using Filebeat 1.x, then you would need to use Logstash to parse the JSON data from the message field. In that case you would configure Filebeat -> Logstash -> Elasticsearch.
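For the Filebeat 1.x route, a minimal Logstash pipeline sketch using the json filter on the message field might look like this (the port and hosts values are assumptions for illustration, not taken from this thread):

```
input {
  beats {
    port => 5044
  }
}

filter {
  # Decode the JSON string in the "message" field into top-level fields
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```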

Filebeat 5.x configuration:

filebeat:
  prospectors:
    - paths:
        - /var/log/app.log
      json.message_key: msg
      json.keys_under_root: true
      json.add_error_key: true

output:
  console:
    pretty: true
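With json.keys_under_root enabled, each key from the decoded JSON log line should land at the top level of the event instead of inside "message". Roughly, for the example record above (other Filebeat metadata fields omitted for brevity):

```
{
  "source": "/var/log/app.log",
  "event": "click",
  "topic": "video",
  "key": "123",
  "level": "fatal",
  "msg": "Failed to send event",
  "time": "2015-08-12T18:47:07Z"
}
```

At that point 'event', 'topic', and 'key' are ordinary fields you can query in Kibana.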

Great, thank you @andrewkroh! I'll go ahead and give 5.x a shot.

My setup will be just Filebeat -> Elasticsearch.

This topic was automatically closed after 21 days. New replies are no longer allowed.