Filebeat loses lines from Docker JSON log file

Hi,

I'm using Filebeat to process Docker JSON logs and ship them to Logstash. My Filebeat config:

    filebeat.inputs:
      - type: docker
        containers.ids: '*'
        processors:
          - add_docker_metadata: ~
          - rename:
              fields:
                - from: fields
                  to: fields
                - from: docker.container.labels.com.amazonaws.ecs
                  to: ecs
                - from: docker.container.labels.appId
                  to: appId
                - from: docker.container.labels.appVersion
                  to: appVersion
                - from: docker.container.labels.appAlias
                  to: appAlias
              ignore_missing: true
              fail_on_error: false
          - drop_fields:
              fields:
                - docker.container.labels
        close_timeout: 1h
        json.ignore_decoding_error: true
        json.keys_under_root: true
        json.message_key: message
        multiline:
          pattern: '^[[:space:]]+|^Caused by:'
          negate: false
          match: after

    processors:
      - add_cloud_metadata: ~

    output.logstash:
      hosts:
        - logs:5044
      ttl: 60

Docker is configured to use the json-file logging driver with:

    max-size: 50m
    max-file: 2
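For completeness, here is a minimal docker-compose sketch of that logging setup (the service name and image are placeholders, not my real stack):

    services:
      app:                      # placeholder service name
        image: my-app:latest    # placeholder image
        logging:
          driver: json-file
          options:
            max-size: "50m"     # rotate once the active file reaches 50 MB
            max-file: "2"       # keep the active file plus one rotated file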

The problem is that the containers produce a lot of log output, so the JSON log files rotate frequently, and because of that some log lines from the containers never reach Logstash.
Can you tell me how to tune my Filebeat configuration, or what I'm doing wrong here that causes lines to be lost?
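
For reference, these are the rotation-related options from the log input docs that I suspect are relevant here; the values below are the documented defaults, which I have not changed (just a sketch of what I've been looking at, not a verified fix):

    filebeat.inputs:
      - type: docker
        containers.ids: '*'
        # Documented defaults for the underlying log input:
        scan_frequency: 10s   # how often Filebeat checks for new/rotated files
        close_inactive: 5m    # close the harvester after this much inactivity
        close_removed: true   # stop reading a file as soon as it is deleted
        clean_removed: true   # drop registry state once the file is gone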

Thanks everyone for your help!
