I'm using filebeat 6.3 to read docker log files written by the default docker json-file logging driver. Lines over 16KB are being reported by filebeat as two separate log events.
The symptoms look like the situation the combine_partial[1] option is supposed to address. However, combine_partial is supposed to default to true, and even when I explicitly set it to true I see the same behavior.
What I'm trying to do is ingest a docker container log where our application has logged a large chunk of JSON. We use decode_json_fields to parse one of the log fields, but since the line gets split, the JSON doesn't get parsed.
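For context, the raw entries in the container's json-file log for one oversized line look roughly like this (content and timestamps are illustrative, not our real data); as far as I understand, the first entry's "log" value has no trailing \n, which is what combine_partial is supposed to use to stitch the pieces back together:

{"log":"{\"level\":\"info\",\"message\":\"{ ...first ~16KB of our JSON... ","stream":"stdout","time":"2018-07-10T12:00:00.000000000Z"}
{"log":" ...rest of the JSON... }\"}\n","stream":"stdout","time":"2018-07-10T12:00:00.000000001Z"}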
My filebeat configuration is:
- type: docker
  combine_partial: true
  containers:
    ids:
      - '*'
    path: ${CONTAINER_LOGS_DIR}
    stream: 'all'
  json:
    keys_under_root: true
    overwrite_keys: true
    add_error_key: true
    message_key: message
  tags:
    - 'filebeat'
  fields:
    abltools_application: 'filebeat'
    abltools_environment: ${ABLTOOLS_ENVIRONMENT}

processors:
  - add_docker_metadata: ~
  - add_host_metadata: ~
  - decode_json_fields:
      fields:
        - 'message'
      max_depth: 1
      target: "json_message"
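In case it helps with reproducing, something like this quick Python sketch (field names and sizes are just illustrative, mirroring the shape of our real payload) emits a single JSON log line well past 16KB when run inside a container:

import json
import sys

# Build a log record whose "message" field holds a large chunk of JSON,
# then emit it as one line well over Docker's 16KB json-file split size.
inner = json.dumps({"blob": "x" * 20000})
record = {"level": "info", "message": inner}
sys.stdout.write(json.dumps(record) + "\n")
sys.stdout.flush()

That is roughly the shape of what our application logs when I see the line come through as two events.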
Is there something obvious I'm doing wrong? Or is this a bug?
Thanks for the help!