Hello,
I'm using an ELK stack with Filebeat to ship JSON logs from Docker containers (Kubernetes) to Elasticsearch. The problem I'm having is that large log entries seem to be skipped. Is there a setting I can change so that Filebeat picks up JSON objects of ~200 KB and larger?
filebeat.yml: |-
  filebeat.config:
    inputs:
      # Mounted `filebeat-inputs` configmap:
      path: ${path.config}/inputs.d/*.yml
      # Reload inputs configs as they change:
      reload.enabled: false
    modules:
      path: ${path.config}/modules.d/*.yml
      # Reload module configs as they change:
      reload.enabled: false

  processors:
    - add_cloud_metadata: ~
    - decode_json_fields:
        when.regexp.message: '^{'
        fields: ["message"]
        target: ""
        overwrite_keys: true
    - drop_event:
        when.not.contains.message: "\"cid\":"

  output.elasticsearch:
    hosts: ["loggingelastic:9200"]