I have configured Filebeat v7.1.0 to autodiscover a particular Docker container and ship its Docker logs to Logstash, which in turn sends them to Elasticsearch. Everything runs as Docker images. When we start the containers, the Docker logs flow into Logstash correctly, and the same can be seen in the Logstash container's logs (unfortunately the Filebeat container's logs show nothing). The problem is that sometimes the logs stop flowing into Logstash, and if we restart the Filebeat container, it starts pushing the container logs again.
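One thing that may help with diagnosis: Filebeat normally logs to stderr when run in the foreground, so if `docker logs` on the Filebeat container shows nothing, raising the log verbosity in filebeat.yml should surface harvester and autodiscover activity around the time the stall happens. A minimal sketch (the selector names are assumptions; check the logging options for your Filebeat version):

```yaml
# filebeat.yml - make Filebeat's own logs visible and verbose
logging.level: debug
logging.to_files: false   # keep output on stderr so `docker logs` shows it
logging.selectors: ["harvester", "input", "autodiscover"]   # assumed selector names
```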
Here are the filebeat.yml config entries:
filebeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false
filebeat.autodiscover:
  providers:
    - type: docker
      templates:
        - condition:
            contains:
              docker.container.name: container_name
          config:
            - type: docker
              containers.ids:
                - "${data.docker.container.id}"
              scan_frequency: 10s
processors:
  - add_cloud_metadata: ~
output.logstash:
  hosts: '${LOGSTASH_HOSTS:logstash:5044}'
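Since the stall clears after a restart, it may be related to how the input handles idle or rotated container log files. A hedged sketch of options worth experimenting with on the docker input (the values shown are assumptions, not verified defaults for v7.1.0; check the docs for your version):

```yaml
- type: docker
  containers.ids:
    - "${data.docker.container.id}"
  scan_frequency: 10s
  close_inactive: 5m    # close and later reopen handles on files that go quiet
  clean_removed: true   # drop registry state for files removed by log rotation
```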
Here are the Logstash pipeline conf entries:
input {
  beats {
    port => 5044
  }
}
filter {
  mutate {
    replace => [ "message", "%{message}" ]
    gsub => [ "message", "\r", "" ]
  }
  if [message] =~ "bson-start" {
    grok {
      match => [ "message",
        "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}):%{LOGLEVEL:level} .* bson-start +(?<SisCompleteMessage>.*) bson-end"
      ]
    }
    if [SisCompleteMessage] =~ "instructionId" {
      json {
        source => "SisCompleteMessage"
      }
    }
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["elasticsearch:9200"]
  }
}
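For reference, the grok filter above assumes log lines shaped roughly like this (the timestamp, logger name, and payload below are invented placeholders, not real application output):

```text
2019-06-01 12:34:56,789:INFO some.Logger - bson-start {"instructionId":"abc123"} bson-end
```

The `timestamp` and `level` fields come from the leading pattern, `SisCompleteMessage` captures everything between `bson-start` and `bson-end`, and because that capture contains `instructionId`, the json filter then expands it into top-level event fields.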