JSON parsing / containers / Kubernetes


I have a Node.js application running in Kubernetes, with Filebeat set up as a DaemonSet, and it's all working pretty well. I'm trying to pick up the logs from my Node.js app, which uses the Winston logger and logs in JSON format, with the info/error/etc. log levels being colour-coded. When Filebeat tries to decode the logs I get this error:

{"log":"2020-10-06T12:46:27.387Z\u0009ERROR\u0009readjson/json.go:52\u0009Error decoding JSON: invalid character '\\x1b' looking for beginning of value\n","stream":"stderr","time":"2020-10-06T12:46:27.387887165Z"}

Example log event:

{"log":"\u001b[32minfo\u001b[39m: gel.api.jsonp.middlewares.session.extractor {\"json\":\"{\\\"data\\\":\\\"SESSION NOT FOUND\\\"}\",\"timestamp\":\"2020-10-06T08:46:50.883Z\",\"platformKey\":\"fdddf\",\"threadID\":\"3d0ds3db-7de3-4175-b6ee-3e41f94ce223\",\"app\":\"gel.api\",\"correlationID\":\"no-value\"}\n","stream":"stdout","time":"2020-10-06T08:46:50.88369154Z"}
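The decode failure comes from the ANSI colour escapes (`\u001b[32m` / `\u001b[39m`) that Winston's colorize format wraps around the level: Filebeat's JSON decoder hits the `\x1b` byte before it ever sees a `{`. A minimal Node sketch (my own, not Filebeat behaviour; the log line is abbreviated from the event above) showing the payload parses fine once the escapes and the level/logger-name prefix are stripped:

```javascript
// Abbreviated "log" field from the container log event above, with the
// ANSI colour escapes around the level intact.
const raw = '\u001b[32minfo\u001b[39m: gel.api.jsonp.middlewares.session.extractor ' +
  '{"json":"{\\"data\\":\\"SESSION NOT FOUND\\"}","timestamp":"2020-10-06T08:46:50.883Z"}';

// Strip ANSI colour sequences (ESC [ ... m), then drop everything before
// the first '{' so only the JSON payload remains.
const noAnsi = raw.replace(/\u001b\[[0-9;]*m/g, '');
const payload = noAnsi.slice(noAnsi.indexOf('{'));
const event = JSON.parse(payload);
console.log(event.timestamp); // 2020-10-06T08:46:50.883Z
```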


My Filebeat input config:

- type: container
  paths:
    - /var/log/containers/gel-*api*.log
  json.keys_under_root: true
  json.add_error_key: true
  json.message_key: json
  processors:
    - add_kubernetes_metadata:
        host: ${NODE_NAME}
        matchers:
          - logs_path:
              logs_path: "/var/log/containers/"
    - add_tags:
        tags: [nodejs-logs]

I'm thinking I could strip out the first part (the colour-coded log level, logger name, etc.), but how? Is there a better way to handle these events/log entries?
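One option is to avoid the problem at the source and only colourize locally, emitting plain JSON in the container so each line is directly decodable. A hedged sketch assuming winston 3.x (the `NODE_ENV` switch is my assumption, not part of the original setup):

```javascript
const winston = require('winston');

// Colourized output is only useful on a local TTY. In Kubernetes, emit
// pure JSON so Filebeat's json.* options can decode each line directly.
const devFormat = winston.format.combine(
  winston.format.colorize(),
  winston.format.simple()
);
const prodFormat = winston.format.combine(
  winston.format.timestamp(),
  winston.format.json()
);

const logger = winston.createLogger({
  level: 'info',
  format: process.env.NODE_ENV === 'production' ? prodFormat : devFormat,
  transports: [new winston.transports.Console()]
});
```

With the JSON format in production, the whole line is the JSON document, so there is no prefix to strip.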

Any help would be appreciated, thanks!
