Configuring ELK stack log collection with Filebeat

I deployed the stack in Docker and ran into a log parsing problem.
About the configuration:
Docker uses the default logging driver (json-file). The label co.elastic.logs/module=kibana has also been added to the container.
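
For reference, the label can be set in the compose file like this (a minimal sketch; the service name and image match the event shown below):

services:
  kibana:
    image: kibana:8.13.4
    labels:
      co.elastic.logs/module: kibana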

On the Filebeat side:

filebeat.autodiscover:
  providers:
    - type: docker
      hints.enabled: true

In the end everything comes down to the ingest pipeline processing, and the pipeline cannot find the fields it needs. An example event:

{
  "@timestamp": "2025-05-21T11:51:15.472Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "_doc",
    "version": "8.7.1"
  },
  "log": {
    "offset": 363231,
    "file": {
      "path": "/var/lib/docker/containers/e746aeab9bdea8b2c0e4aa4514aadcfc988c1b8b591ff37029e374066ab97637/e746aeab9bdea8b2c0e4aa4514aadcfc988c1b8b591ff37029e374066ab97637-json.log"
    }
  },
  "message": "{\"log\":\"[2025-05-21T11:23:52.842+00:00][INFO ][savedobjects-service] [.kibana_alerting_cases] Migration completed after 97ms\\n\",\"stream\":\"stdout\",\"time\":\"2025-05-21T11:23:52.842838561Z\"}",
  "input": {
    "type": "filestream"
  },
  "ecs": {
    "version": "8.0.0"
  },
  "host": {
    "ip": [
      "192.123.123.123"
    ],
    "mac": [
      "00-50-56-18-2A-49",
      "1E-60-59-5D-36-D7",
      "D6-B1-C9-72-A3-A1"
    ],
    "hostname": "lsg-kbn",
    "architecture": "x86_64",
    "os": {
      "version": "",
      "family": "debian",
      "name": "Debian GNU/Linux",
      "kernel": "6.1.0-33-amd64",
      "codename": "trixie",
      "type": "linux",
      "platform": "debian"
    },
    "name": "lsg-kbn",
    "id": "c20ef26f853d47e289c31ab34c83148d",
    "containerized": false
  },
  "agent": {
    "ephemeral_id": "67e015bf-628b-4ae3-a962-582e11d7b229",
    "id": "f042e87f-79ec-4082-9228-96633818c1cf",
    "name": "lsg-kbn",
    "type": "filebeat",
    "version": "8.7.1"
  },
  "container": {
    "labels": {
      "org_label-schema_version": "8.13.4",
      "com_docker_compose_project_config_files": "/u/kibana/docker-compose.yaml",
      "org_opencontainers_image_documentation": "https://www.elastic.co/guide/en/kibana/reference/index.html",
      "com_docker_compose_oneoff": "False",
      "org_opencontainers_image_vendor": "Elastic",
      "co_elastic_logs/module": "kibana",
      "org_opencontainers_image_ref_name": "ubuntu",
      "org_opencontainers_image_licenses": "Elastic License",
      "com_docker_compose_project": "kibana",
      "org_label-schema_license": "Elastic License",
      "org_label-schema_schema-version": "1.0",
      "org_opencontainers_image_version": "8.13.4",
      "org_label-schema_vendor": "Elastic",
      "com_docker_compose_depends_on": "",
      "org_label-schema_vcs-url": "https://github.com/elastic/kibana",
      "com_docker_compose_config-hash": "556af915fce96b58bd7ecad1a142a41ab4a19bb4a68f33caf18a3bf83182ddb5",
      "com_docker_compose_project_working_dir": "/u/kibana",
      "com_docker_compose_image": "sha256:d40f2a312340150aee47a796198457fc83f5e80bd27fe1a2c27a500527438c19",
      "org_opencontainers_image_revision": "f5dc24d1969f80e4aa3ced7cc375dd00554f8c0c",
      "org_label-schema_vcs-ref": "f5dc24d1969f80e4aa3ced7cc375dd00554f8c0c",
      "org_opencontainers_image_created": "2024-05-07T06:06:37.059Z",
      "com_docker_compose_service": "kibana",
      "org_opencontainers_image_source": "https://github.com/elastic/kibana",
      "org_label-schema_usage": "https://www.elastic.co/guide/en/kibana/reference/index.html",
      "com_docker_compose_version": "2.35.1",
      "org_label-schema_url": "https://www.elastic.co/products/kibana",
      "org_opencontainers_image_title": "Kibana",
      "org_label-schema_build-date": "2024-05-07T06:06:37.059Z",
      "org_label-schema_name": "Kibana",
      "com_docker_compose_container-number": "1",
      "org_opencontainers_image_url": "https://www.elastic.co/products/kibana"
    },
    "image": {
      "name": "kibana:8.13.4"
    },
    "name": "kibana",
    "id": "e746aeab9bdea8b2c0e4aa4514aadcfc988c1b8b591ff37029e374066ab97637"
  },
  "data_stream": {
    "dataset": "filebeat",
    "type": "logs",
    "namespace": "default"
  }
}

The pipeline filebeat-8.7.1-kibana-log-pipeline-7 fails at the rename stage (which renames "json" to "kibana.log.meta") because there is no json field.

Pipeline settings:

[
  {
    "set": {
      "value": "{{_ingest.timestamp}}",
      "field": "event.ingested"
    }
  },
  {
    "set": {
      "field": "event.created",
      "copy_from": "@timestamp"
    }
  },
  {
    "rename": {
      "field": "json",
      "target_field": "kibana.log.meta"
    }
  },
  {
    "date": {
      "formats": [
        "ISO8601"
      ],
      "target_field": "@timestamp",
      "field": "kibana.log.meta.@timestamp"
    }
  },
  {
    "remove": {
      "field": "kibana.log.meta.@timestamp"
    }
  },
  {
    "remove": {
      "field": "message"
    }
  },
  {
    "rename": {
      "field": "kibana.log.meta.message",
      "target_field": "message"
    }
  },
  {
    "rename": {
      "ignore_missing": true,
      "field": "kibana.log.meta.state",
      "target_field": "kibana.log.state"
    }
  },
  {
    "rename": {
      "field": "kibana.log.meta.pid",
      "target_field": "process.pid"
    }
  },
  {
    "rename": {
      "field": "kibana.log.meta.tags",
      "target_field": "kibana.log.tags"
    }
  },
  {
    "rename": {
      "field": "kibana.log.meta.res.statusCode",
      "target_field": "http.response.status_code",
      "ignore_missing": true
    }
  },
  {
    "script": {
      "lang": "painless",
      "source": "ctx.event.duration = Math.round(ctx.kibana.log.meta.res.responseTime * 1000000L)",
      "if": "ctx?.kibana?.log?.meta?.res?.responseTime != null"
    }
  },
  {
    "remove": {
      "field": "kibana.log.meta.res.responseTime",
      "ignore_missing": true
    }
  },
  {
    "rename": {
      "target_field": "http.response.body.bytes",
      "ignore_missing": true,
      "field": "kibana.log.meta.res.contentLength"
    }
  },
  {
    "rename": {
      "field": "kibana.log.meta.req.method",
      "target_field": "http.request.method",
      "ignore_missing": true
    }
  },
  {
    "rename": {
      "target_field": "http.request.referrer",
      "ignore_missing": true,
      "field": "kibana.log.meta.req.headers.referer"
    }
  },
  {
    "rename": {
      "ignore_missing": true,
      "field": "kibana.log.meta.req.headers.user-agent",
      "target_field": "user_agent.original"
    }
  },
  {
    "rename": {
      "field": "kibana.log.meta.req.remoteAddress",
      "target_field": "source.address",
      "ignore_missing": true
    }
  },
  {
    "set": {
      "field": "source.ip",
      "value": "{{source.address}}",
      "ignore_empty_value": true
    }
  },
  {
    "rename": {
      "field": "kibana.log.meta.req.url",
      "target_field": "url.original",
      "ignore_missing": true
    }
  },
  {
    "remove": {
      "field": "kibana.log.meta.req.referer",
      "ignore_missing": true
    }
  },
  {
    "remove": {
      "field": "kibana.log.meta.statusCode",
      "ignore_missing": true
    }
  },
  {
    "remove": {
      "field": "kibana.log.meta.method",
      "ignore_missing": true
    }
  },
  {
    "append": {
      "field": "service.name",
      "value": "kibana"
    }
  },
  {
    "set": {
      "field": "event.kind",
      "value": "event"
    }
  },
  {
    "script": {
      "lang": "painless",
      "source": "if (ctx?.kibana?.log?.state != null) {\n  if (ctx.kibana.log.state == \"red\") {\n    ctx.event.type = \"error\";\n  } else {\n    ctx.event.type = \"info\";\n  }\n}"
    }
  },
  {
    "set": {
      "field": "event.outcome",
      "value": "success",
      "if": "ctx?.http?.response?.status_code != null && ctx.http.response.status_code < 400"
    }
  },
  {
    "set": {
      "value": "failure",
      "if": "ctx?.http?.response?.status_code != null && ctx.http.response.status_code >= 400",
      "field": "event.outcome"
    }
  }
]
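
The failure can be reproduced with the simulate API by feeding the pipeline a document whose message is still the raw Docker json-file string (a sketch; the pipeline name is taken from the @metadata of the events above):

POST _ingest/pipeline/filebeat-8.7.1-kibana-log-pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "{\"log\":\"[2025-05-21T11:23:52.842+00:00][INFO ][savedobjects-service] [.kibana_alerting_cases] Migration completed after 97ms\\n\",\"stream\":\"stdout\",\"time\":\"2025-05-21T11:23:52.842838561Z\"}"
      }
    }
  ]
}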

Has anyone encountered this problem, and how was it solved?

Hi,

It looks like the problem here is that the Docker log line hasn't been parsed into a JSON object. It only exists as a string in the message field: "message": "{\"log\":\"[2025-05-21T11:23:52.842+00:00][INFO ]....

You'll need to parse the message into JSON before it is renamed and further processed in the ingest pipeline. This can be done either on the Filebeat side or in the ingest pipeline.

In filebeat, you can add a processor to do it:

filebeat.autodiscover:
  providers:
    - type: docker
      hints.enabled: true
processors:
  - decode_json_fields:
      fields: ["message"]
      target: "json"
      overwrite_keys: true
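
If some containers also emit lines that are not valid JSON, decode_json_fields does not drop the event; it leaves message as-is, and the add_error_key option can be set to annotate events whose message failed to decode (a sketch of the same processor with that option):

processors:
  - decode_json_fields:
      fields: ["message"]
      target: "json"
      overwrite_keys: true
      add_error_key: true   # record the decode error on events whose message is not JSON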

or in the ingest pipeline, before the rename processor, add a json processor:

{
  "json": {
    "field": "message",
    "target_field": "json"
  }
}

You only need to convert the message to a JSON object in one place.

Thank you for your reply, but unfortunately this solution is not suitable, as there are other messages that cannot be decoded as JSON. For example, this event:

{
  "@timestamp": "2025-05-22T07:42:29.505Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "_doc",
    "version": "8.7.1",
    "pipeline": "filebeat-8.7.1-kibana-log-pipeline"
  },
  "docker": {
    "container": {
      "labels": {
        "org_label-schema_license": "Elastic License",
        "org_label-schema_usage": "https://www.elastic.co/guide/en/kibana/reference/index.html",
        "com_docker_compose_image": "sha256:d40f2a312340150aee47a796198457fc83f5e80bd27fe1a2c27a500527438c19",
        "org_opencontainers_image_licenses": "Elastic License",
        "org_opencontainers_image_ref_name": "ubuntu",
        "com_docker_compose_depends_on": "",
        "org_label-schema_vendor": "Elastic",
        "org_label-schema_version": "8.13.4",
        "org_opencontainers_image_documentation": "https://www.elastic.co/guide/en/kibana/reference/index.html",
        "org_opencontainers_image_title": "Kibana",
        "com_docker_compose_project_config_files": "/u/kibana/docker-compose.yaml",
        "org_opencontainers_image_url": "https://www.elastic.co/products/kibana",
        "org_opencontainers_image_created": "2024-05-07T06:06:37.059Z",
        "org_label-schema_schema-version": "1.0",
        "com_docker_compose_config-hash": "23384bf50a20d293a0c27ba87e7d998134acd0c58d0517594e0279dc834b5042",
        "co_elastic_logs/module": "kibana",
        "com_docker_compose_project": "kibana",
        "org_label-schema_vcs-ref": "f5dc24d1969f80e4aa3ced7cc375dd00554f8c0c",
        "org_opencontainers_image_vendor": "Elastic",
        "org_opencontainers_image_revision": "f5dc24d1969f80e4aa3ced7cc375dd00554f8c0c",
        "org_opencontainers_image_version": "8.13.4",
        "org_label-schema_url": "https://www.elastic.co/products/kibana",
        "com_docker_compose_oneoff": "False",
        "org_label-schema_build-date": "2024-05-07T06:06:37.059Z",
        "com_docker_compose_container-number": "1",
        "com_docker_compose_version": "2.35.1",
        "org_opencontainers_image_source": "https://github.com/elastic/kibana",
        "com_docker_compose_service": "kibana",
        "com_docker_compose_project_working_dir": "/u/kibana",
        "org_label-schema_name": "Kibana",
        "org_label-schema_vcs-url": "https://github.com/elastic/kibana"
      }
    }
  },
  "agent": {
    "type": "filebeat",
    "version": "8.7.1",
    "ephemeral_id": "60805bbc-b2df-4673-afc7-1d175c2e2d6a",
    "id": "f042e87f-79ec-4082-9228-96633818c1cf",
    "name": "lsg-kbn"
  },
  "data_stream": {
    "dataset": "filebeat",
    "type": "logs",
    "namespace": "kibana"
  },
  "stream": "stdout",
  "message": "[2025-05-22T07:42:29.505+00:00][INFO ][plugins.securitySolution.endpoint:user-artifact-packager:1.0.0] Complete. Task run took 4ms [ stated: 2025-05-22T07:42:29.501Z ]",
  "fileset": {
    "name": "log"
  },
  "service": {
    "type": "kibana"
  },
  "ecs": {
    "version": "1.12.0"
  },
  "host": {
    "architecture": "x86_64",
    "os": {
      "version": "",
      "family": "debian",
      "name": "Debian GNU/Linux",
      "kernel": "6.1.0-33-amd64",
      "codename": "trixie",
      "type": "linux",
      "platform": "debian"
    },
    "id": "c20ef26f853d47e289c31ab34c83148d",
    "containerized": false,
    "name": "lsg-kbn",
    "ip": [
      "192.168.113.113"
    ],
    "mac": [
      "00-50-56-18-2A-49",
      "1E-60-59-5D-36-D7",
      "D6-B1-C9-72-A3-A1"
    ],
    "hostname": "lsg-kbn"
  },
  "log": {
    "offset": 71415,
    "file": {
      "path": "/var/lib/docker/containers/c06ec0dd6a9b54136f8221ebe6668d488b251036ff3d60b32136a37d33253bd9/c06ec0dd6a9b54136f8221ebe6668d488b251036ff3d60b32136a37d33253bd9-json.log"
    }
  },
  "input": {
    "type": "container"
  },
  "event": {
    "module": "kibana",
    "dataset": "kibana.log"
  },
  "container": {
    "id": "c06ec0dd6a9b54136f8221ebe6668d488b251036ff3d60b32136a37d33253bd9",
    "name": "kibana",
    "image": {
      "name": "kibana:8.13.4"
    }
  }
}

will cause the following error:

{
  "docs": [
    {
      "error": {
        "root_cause": [
          {
            "type": "x_content_parse_exception",
            "reason": "[1:7] Unexpected character ('-' (code 45)): was expecting comma to separate Array entries\n at [Source: (String)\"[2025-05-22T07:42:29.505+00:00][INFO ][plugins.securitySolution.endpoint:user-artifact-packager:1.0.0] Complete. Task run took 4ms [ stated: 2025-05-22T07:42:29.501Z ]\"; line: 1, column: 7]"
          }
        ],
        "type": "x_content_parse_exception",
        "reason": "[1:7] Unexpected character ('-' (code 45)): was expecting comma to separate Array entries\n at [Source: (String)\"[2025-05-22T07:42:29.505+00:00][INFO ][plugins.securitySolution.endpoint:user-artifact-packager:1.0.0] Complete. Task run took 4ms [ stated: 2025-05-22T07:42:29.501Z ]\"; line: 1, column: 7]",
        "caused_by": {
          "type": "json_parse_exception",
          "reason": "Unexpected character ('-' (code 45)): was expecting comma to separate Array entries\n at [Source: (String)\"[2025-05-22T07:42:29.505+00:00][INFO ][plugins.securitySolution.endpoint:user-artifact-packager:1.0.0] Complete. Task run took 4ms [ stated: 2025-05-22T07:42:29.501Z ]\"; line: 1, column: 7]"
        }
      }
    }
  ]
}
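
One way to avoid the hard failure would be to make the json processor conditional (a sketch, assuming that JSON messages always start with "{"), although the plain-text lines would then still lack the fields the rest of the pipeline expects:

{
  "json": {
    "field": "message",
    "target_field": "json",
    "if": "ctx.message != null && ctx.message.startsWith('{')"
  }
}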

It seems strange to me that there is no ready-made solution for the ELK stack itself when it runs in Docker. I suppose I might have missed something; is there a standard solution?

I have found workarounds for each component of the stack. For Kibana, writing the logs out to a file and monitoring that file. For Logstash, editing the logging settings file and also moving the log file onto the host machine. For Elasticsearch, scraping the logs via a filestream input with parsers, as sketched below. But these are workarounds rather than a proper solution. It's a pity that this doesn't seem to have been provided for out of the box.
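
For reference, a minimal sketch of the Elasticsearch workaround (the log path is an assumption; it presumes the JSON server log is mounted onto the host):

filebeat.inputs:
  - type: filestream
    id: elasticsearch-server-log            # filestream inputs require a unique id
    paths:
      - /u/elasticsearch/logs/*_server.json # assumed host-side mount of the container log
    parsers:
      - ndjson:
          target: ""            # merge the decoded keys into the event root
          overwrite_keys: true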