I've spent about six hours troubleshooting this deployment and keep going in circles. I have several containers running in Docker and have deployed filebeat-oss:7.5.2 to forward their logs.
My containers write JSON-formatted log entries to stdout, which Docker then writes to a log file as a string embedded inside its own JSON-formatted log entry. Here are example entries Docker writes to disk; my application's message is contained in the outer "log" key:
```json
{"log":"{\"message\": \"Received task: tasks.scrape_prices[3bc29ecf-4dcc-4317-b690-6aa47bb0761b] \"}\n","stream":"stdout","time":"2020-01-24T01:32:14.6139176Z"}
{"log":"{\"message\": \"Starting Price Scraper...\", \"task_id\": \"3bc29ecf-4dcc-4317-b690-6aa47bb0761b\", \"task_name\": \"scrape_prices\", \"timestamp\": 1579829534.6769736, \"logger\": \"tasks\", \"level\": \"info\"}\n","stream":"stdout","time":"2020-01-24T01:32:14.6776285Z"}
{"log":"{\"message\": \"Web driver was terminated.\", \"task_id\": \"3bc29ecf-4dcc-4317-b690-6aa47bb0761b\", \"task_name\": \"scrape_prices\", \"logger\": \"ProductScraper\", \"level\": \"info\", \"timestamp\": 1579829534.7955523}\n","stream":"stdout","time":"2020-01-24T01:32:14.796053Z"}
```
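To illustrate the nesting: the payload is JSON inside JSON, so any consumer has to decode it in two passes. A minimal Python sketch (using a shortened version of one of the lines above):

```python
import json

# One raw line as Docker writes it to disk: the application's JSON
# message is string-escaped inside the outer "log" key.
raw = ('{"log":"{\\"message\\": \\"Starting Price Scraper...\\", '
       '\\"level\\": \\"info\\"}\\n","stream":"stdout",'
       '"time":"2020-01-24T01:32:14.6776285Z"}')

outer = json.loads(raw)           # first pass: Docker's wrapper object
inner = json.loads(outer["log"])  # second pass: the application's entry

print(inner["message"])  # Starting Price Scraper...
print(inner["level"])    # info
```

This two-pass structure is exactly what I'm hoping the decode_json_fields processor handles for me on the "log" field.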
I believe I've configured Filebeat to look inside the "log" key and parse the JSON, but it doesn't seem to be doing so. Here is my filebeat.yml, which is baked into the Filebeat image (the Elasticsearch output config has been removed):
```yaml
filebeat.inputs:
  - type: docker
    enabled: true
    containers:
      stream: all # can be all, stdout or stderr
      ids:
        - '*'

filebeat.autodiscover:
  providers:
    - type: docker
      hints.enabled: true

processors:
  - add_docker_metadata: ~
  - add_locale:
      format: offset
  - decode_json_fields:
      fields: ["log"]
      target: "jslog"
      max_depth: 4
      process_array: true
      add_error_key: true

logging.level: debug

# ...(truncated)...
```
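For clarity, here is my mental model of what decode_json_fields with target: "jslog" should do to each event, sketched in Python. This mimics the behavior I expect; it is not Filebeat code, and the function name is my own:

```python
import json

def decode_json_field(event, field="log", target="jslog"):
    """Mimic of what I expect decode_json_fields to do: parse the
    JSON string found under `field` and attach the resulting object
    under `target`, leaving the original field untouched."""
    try:
        event[target] = json.loads(event[field])
    except (KeyError, ValueError):
        # add_error_key: true should tag failures rather than drop them
        event["error"] = "failed to decode '%s' as JSON" % field
    return event

event = {
    "log": '{"message": "Web driver was terminated.", "level": "info"}\n',
    "stream": "stdout",
}
decode_json_field(event)
print(event["jslog"]["message"])  # Web driver was terminated.
```

If the processor worked the way I expect, each document in Elasticsearch would carry a jslog object with message, level, task_id, etc. as separate queryable fields.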
Here is the relevant part of the docker-compose file that deploys the Filebeat service:
```yaml
version: "3.7"

services:
  # ... (truncated) ...
  filebeat:
    build:
      context: .
      dockerfile: Dockerfile-filebeat
    user: root
    volumes:
      - filebeat:/usr/share/filebeat/data
      - /var/run/docker.sock:/var/run/docker.sock
      - /var/lib/docker/containers/:/var/lib/docker/containers/:ro
    # disable strict permission checks
    command: ["--strict.perms=false"]

volumes:
  filebeat:
```
This setup does populate Elasticsearch/Kibana, but the "log" part of each message is not decoded as expected; it still displays as a plain string.
- Is the 'decode_json_fields' processor supposed to work the way I think it does? Should it be breaking the JSON string in 'log' into multiple keys in the resulting record/document?
- Does the OSS version of Filebeat even support this processor?