Filebeat 7.4.0 Envoyproxy configuration issues

This is a continuation of the issue I raised a while ago: Filebeat 7.2.0 autodiscover configuration issues.

I migrated to the ELK stack 7.4.0 and am trying to ship logs from a Kubernetes cluster. Here's the config file I am using:

apiVersion: v1
kind: ConfigMap
metadata:
  name: filebeat-config
  namespace: kube-system
  labels:
    k8s-app: filebeat
data:
  filebeat.yml: |-
    setup.dashboards.enabled: true
    setup.template.enabled: true
    setup.kibana.host: "https://7da093603daa49c98c1d2f7909881417.ap-southeast-2.aws.found.io:9243"

    setup.template.settings:
      index.number_of_shards: 1
    setup.template.overwrite: true
    filebeat.overwrite_pipelines: true

    filebeat.config:
      modules:
        path: ${path.config}/modules.d/*.yml
        # Reload module configs as they change:
        reload.enabled: false

    filebeat.autodiscover:
      providers:
        - type: kubernetes
          templates:
            # condition to ship access logs from istio sidecars
            - condition.and:
                - contains:
                    kubernetes.container.name: istio-proxy
                - contains:
                    kubernetes.labels.app: swift-client
              config:
                - module: envoyproxy
                  log:
                    input:
                      type: docker
                      containers:
                        stream: stdout
                        ids:
                          - ${data.kubernetes.container.id}
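              # alternative: plain docker input for the same container (no module, so the envoyproxy ingest pipeline is skipped)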
              # config:
              #   - type: docker
              #     containers:
              #       stream: stdout
              #       ids:
              #         - ${data.kubernetes.container.id}
            - condition.and:
                - contains:
                    kubernetes.container.name: istio-proxy
                - contains:
                    kubernetes.labels.app: swift-server
              config:
                - type: docker
                  containers:
                    stream: stdout
                    ids:
                      - ${data.kubernetes.container.id}

            # condition for application logs
            - condition.and:
                - contains:
                    kubernetes.container.name: swift-client
                - contains:
                    kubernetes.labels.app: swift-client
              config:
                - type: docker
                  multiline.pattern: '^WARN|^INFO|^DEBUG|^ERROR|^Error|^TypeError'
                  multiline.negate: true
                  multiline.match: after
                  containers:
                    ids:
                      - ${data.kubernetes.container.id}
            - condition.and:
                - contains:
                    kubernetes.container.name: swift-server
                - contains:
                    kubernetes.labels.app: swift-server
              config:
                - type: docker
                  multiline.pattern: '^WARN|^INFO|^DEBUG|^ERROR|^Error|^TypeError'
                  multiline.negate: true
                  multiline.match: after
                  containers:
                    ids:
                      - ${data.kubernetes.container.id}

    processors:
      - add_cloud_metadata: ~
      - add_kubernetes_metadata:
          in_cluster: true

    output.elasticsearch:
      hosts: ['${ELASTICSEARCH_HOST:elasticsearch}:${ELASTICSEARCH_PORT:9243}']
      username: ${ELASTICSEARCH_USERNAME}
      password: ${ELASTICSEARCH_PASSWORD}

I see the following errors on the Filebeat pods:

2019-10-03T11:26:24.867Z WARN elasticsearch/client.go:535 Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Time{wall:0x49f25b7, ext:63705698782, loc:(*time.Location)(nil)}, Meta:common.MapStr{"pipeline":"filebeat-7.4.0-envoyproxy-log-pipeline-entry"}, Fields:common.MapStr{"agent":common.MapStr{"ephemeral_id":"9e8b613c-573e-491f-8509-b7cbaf645105", "hostname":"filebeat-9dkl7", "id":"7ecf8182-20e6-43f8-a128-6fbff9b22005", "type":"filebeat", "version":"7.4.0"}, "cloud":common.MapStr{"account":common.MapStr{"id":"845778257277"}, "availability_zone":"ap-southeast-2a", "image":common.MapStr{"id":"ami-05b461c33a8e8b5b4"}, "instance":common.MapStr{"id":"i-0f4e724b21695c44a"}, "machine":common.MapStr{"type":"m5.large"}, "provider":"aws", "region":"ap-southeast-2"}, "ecs":common.MapStr{"version":"1.1.0"}, "event":common.MapStr{"dataset":"envoyproxy.log", "module":"envoyproxy"}, "fileset":common.MapStr{"name":"log"}, "host":common.MapStr{"name":"filebeat-9dkl7"}, "input":common.MapStr{"type":"docker"}, "kubernetes":common.MapStr{"container":common.MapStr{"image":"istio/proxyv2:1.2.2", "name":"istio-proxy"}, "labels":common.MapStr{"app":"swift-client", "pod-template-hash":"654b75cff6", "version":"e0f3add"}, "namespace":"ratecity", "node":common.MapStr{"name":"ip-192-168-18-219.ap-southeast-2.compute.internal"}, "pod":common.MapStr{"name":"swift-client-e0f3add-654b75cff6-qhx7g", "uid":"de5260d2-e5a3-11e9-a5b2-025a63f6c824"}, "replicaset":common.MapStr{"name":"swift-client-e0f3add-654b75cff6"}}, "log":common.MapStr{"file":common.MapStr{"path":"/var/lib/docker/containers/f8d3161ba65c6ff66eb1ae44d361c2dd8f77ab788209f0e03633ed5ef1cb942c/f8d3161ba65c6ff66eb1ae44d361c2dd8f77ab788209f0e03633ed5ef1cb942c-json.log"}, "offset":5707065}, "message":"[2019-10-03T11:26:18.269Z] \"- - -\" 0 - \"-\" \"-\" 13928 3059 1527 - \"-\" \"-\" \"-\" \"-\" \"162.247.242.27:443\" PassthroughCluster 192.168.16.140:53900 162.247.242.27:443 192.168.16.140:53898 -", "service":common.MapStr{"type":"envoyproxy"}, "stream":"stdout", "tags":[]string{"envoyproxy"}}, Private:file.State{Id:"", Finished:false, Fileinfo:(*os.fileStat)(0xc0007868f0), Source:"/var/lib/docker/containers/f8d3161ba65c6ff66eb1ae44d361c2dd8f77ab788209f0e03633ed5ef1cb942c/f8d3161ba65c6ff66eb1ae44d361c2dd8f77ab788209f0e03633ed5ef1cb942c-json.log", Offset:5707335, Timestamp:time.Time{wall:0xbf5d94712b9da100, ext:55351681075, loc:(*time.Location)(0x4de3580)}, TTL:-1, Type:"docker", Meta:map[string]string{"stream":"stdout"}, FileStateOS:file.StateOS{Inode:0x42056d8, Device:0x10301}}, TimeSeries:false}, Flags:0x1} (status=400): {"type":"mapper_parsing_exception","reason":"failed to parse field [destination.bytes] of type [long] in document with id 'xdVekW0BOoxUBKT2_swG'. Preview of field's value: '\"-\"'","caused_by":{"type":"illegal_argument_exception","reason":"For input string: \"\"-\"\""}}
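From what I can tell, the event is routed through the module's ingest pipeline (the Meta field shows pipeline: filebeat-7.4.0-envoyproxy-log-pipeline-entry), and Elasticsearch then rejects it because the literal string "-" ends up in destination.bytes, which is mapped as long. The rejected message is a PassthroughCluster (plain TCP) access-log line where several positions are just dashes. As a way to narrow this down, a minimal template that ships the same istio-proxy container as a plain docker input, so that no ingest pipeline is applied at all, would look roughly like the commented-out block above, i.e.:

filebeat.autodiscover:
  providers:
    - type: kubernetes
      templates:
        - condition.and:
            - contains:
                kubernetes.container.name: istio-proxy
            - contains:
                kubernetes.labels.app: swift-client
          config:
            # plain docker input, no envoyproxy module / ingest pipeline
            - type: docker
              containers:
                stream: stdout
                ids:
                  - ${data.kubernetes.container.id}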

The dashboards in Kibana are also broken.
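Given the indexing failures above, I assume part of the dashboard breakage is simply missing envoyproxy data, but in case the Kibana setup itself is also a factor: the config only passes credentials to output.elasticsearch, not to setup.kibana. I'm not sure whether that matters on Elastic Cloud; for reference, a sketch of the setup.kibana section with explicit credentials (reusing the same environment variables) would be:

setup.kibana:
  host: "https://7da093603daa49c98c1d2f7909881417.ap-southeast-2.aws.found.io:9243"
  username: ${ELASTICSEARCH_USERNAME}
  password: ${ELASTICSEARCH_PASSWORD}

Any help is much appreciated.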
