Elastic Serverless Forwarder JSON formatting

We write our logs in JSON format to CloudWatch.
Does the Elastic Serverless Forwarder (Deploy Elastic Serverless Forwarder | Elastic Serverless Forwarder Guide | Elastic) have any way to send the logs to Elasticsearch in the same JSON structure in which they are generated and written to CloudWatch?
The ESF is currently taking the JSON CloudWatch log and putting the whole thing into the message field.
Functionbeat had a decode_json_fields processor that would break the field out into its own JSON keys and write them to Elasticsearch.
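For reference, the Functionbeat processor being referred to is configured something like the sketch below (the option values shown are the common defaults for decoding a JSON message field into root-level keys, not necessarily our exact config):

    processors:
      - decode_json_fields:
          fields: ["message"]
          target: ""
          overwrite_keys: true
          add_error_key: true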


Hi, we're doing something similar :slight_smile:

Our CloudWatch logs are written from our apps in a JSON structure, but each one has a CloudWatch timestamp in front. In order to break up the message field and separate the keys/values, we forward to a data stream (which in turn uses an ingest pipeline).

Elastic Serverless Forwarder config:

    inputs:
      - type: "cloudwatch-logs"
        id: "arn:aws:logs:eu-west-1:awsid:log-group:logs/dev/var/log/web.stdout.log:*"
        outputs:
          - type: "elasticsearch"
            args:
              elasticsearch_url: "url"
              api_key: "key"
              es_datastream_name: "logs-dev-default"

By default the data stream uses an ingest pipeline, which looks like:

    "grok": {
      "field": "message",
      "patterns": [
    "json": {
      "field": "message",
      "add_to_root": true
    "set": {
      "field": "_index",
      "value": "activity-tracker",
      "if": "ctx.cfg?.identifier == \"uat-log\""

The grok section of the ingest pipeline strips out the CloudWatch data we don't need and puts the JSON application log back into the message field.
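As an illustration only (the actual pattern depends on your log prefix; the ISO8601 timestamp format and the event.created target field here are assumptions), a grok processor that drops a leading timestamp and keeps the JSON payload in message could look like:

    {
      "grok": {
        "field": "message",
        "patterns": ["%{TIMESTAMP_ISO8601:event.created} %{GREEDYDATA:message}"]
      }
    }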

Then we can use the json processor, as you mentioned, to decode the message field and structure the logs.
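For example (field values are illustrative), with add_to_root: true a document whose message field holds a JSON string gains the parsed keys at the root of the document; note the original message field remains unless you also drop it with a remove processor:

    before: { "message": "{\"level\":\"info\",\"msg\":\"started\"}" }
    after:  { "message": "{\"level\":\"info\",\"msg\":\"started\"}", "level": "info", "msg": "started" }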

The set processor is just used in our case to route logs that have a specific identifier to another index, instead of the default location this ingest pipeline sends to.

Hope this helps!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.