Problem with JSON logs


#1

Hi,

I'm new to the Elastic Stack and I can't manage to build a pipeline for my Java logs.
I use the Logstash Logback appender, Filebeat and Elasticsearch for now.
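
For reference, the appender side looks roughly like this (a minimal sketch; the appender name, file path, and log level are from my setup, using the logstash-logback-encoder library's LogstashEncoder):

    <configuration>
      <appender name="STASH" class="ch.qos.logback.core.FileAppender">
        <file>/var/log/myApp/stash_app.log</file>
        <!-- Writes each log event as a single-line JSON object -->
        <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
      </appender>
      <root level="DEBUG">
        <appender-ref ref="STASH"/>
      </root>
    </configuration>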

Here is my Filebeat configuration.

    filebeat.inputs:
    - type: log
      enabled: true
      paths:
        - /var/log/myApp/stash_*
      fields_under_root: true
      fields:
        app_id: myApp
        env: dev
    output.elasticsearch:
      hosts: ["http://MyES:9200"]

It does generate entries in ES:

"@timestamp": "2019-02-08T19:52:36.371Z",
"app_id": "myApp",
...
"message": "{"@timestamp":"2019-02-08T20:52:36.329+01:00","@version":"1","message":"[Application] At path /data/","logger_name":"application","thread_name":"application-akka.actor.default-dispatcher-155","level":"DEBUG","level_value":10000}",
...

As you can see, the message field is stored as a plain-text string instead of being parsed as JSON.

What am I missing?

Thanks in advance


(Andrew Kroh) #2

You need to configure the log input to decode JSON. See https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-input-log.html#filebeat-input-log-config-json.

    filebeat.inputs:
    - type: log
      json.message_key: message
      ...
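
For context, here is a fuller sketch based on the input config from the first post. Only `json.message_key` is strictly needed to trigger decoding; the other `json.*` options shown are optional extras:

    filebeat.inputs:
    - type: log
      enabled: true
      paths:
        - /var/log/myApp/stash_*
      # Decode each line as a JSON object; "message" is the key holding the log text.
      json.message_key: message
      # Place decoded keys at the top level of the event instead of under a "json" key.
      json.keys_under_root: true
      # Let decoded fields (e.g. @timestamp) override the ones Filebeat adds itself.
      json.overwrite_keys: true
      # Add an error key to the event when a line fails to parse as JSON.
      json.add_error_key: true

With `json.keys_under_root` and `json.overwrite_keys` enabled, the `@timestamp`, `level`, and `logger_name` fields from the appender land directly on the Elasticsearch document instead of staying inside a plain-text `message` string.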

#3

Of course!
Working great, thanks a lot! :slight_smile: