ECS Logger: Mapping Logs to ECS Fields

I am quite new to Elastic. I have log data from AWS CloudWatch that I ship into my Elastic cluster using Filebeat, but I can't figure out how to map my logs onto ECS fields. I have read a lot of documentation, but none of it actually shows how to do this. I am using an ECS logger, yet all of my data ends up in the message field and I can't aggregate on it. My main question is: how can I log my messages in a format that lets me aggregate and visualize them in Kibana?

I watched the webinar above, but I didn't understand how the presenter decides to map his data to particular ECS fields, or how we can make sure afterwards that our logs are actually mapped to those fields.

I am trying to create a mapping, for example:
event.name = postprocessor
event.source = cirus
event.type = success
and so on.

I can't figure out how to work with the ECS logger so that I can visualize this data in Kibana. I would really appreciate the help.

Hi,

You can find the overall documentation for ECS here.

The fields of interest in this scenario will be the event fields and the log fields.

Adhering to these formats will let you view your logs not only in the Logs UI but also in tools like Discover.
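
As a purely illustrative sketch (the values here are invented, not from your data), an ECS-formatted log line is a JSON document built from those field sets:

  {
    "@timestamp": "2021-06-07T13:18:47.110Z",
    "message": "postprocessor finished",
    "event": {
      "kind": "event",
      "outcome": "success"
    },
    "log": {
      "level": "info",
      "logger": "my-app"
    }
  }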

Hello Kerry, thank you for your reply!

I read all of those documents, but I couldn't find the part that explains how to adhere to those formats.

Here is a log line from ecs-logging; in Kibana it shows up only under the message field:
2021-06-07T13:18:47.110Z 2021-06-07 13:18.47 [info ] {'labels': {'scan.category': 'APP_SECURTY_CATEGORY', 'scan.type': 'VULNCHECK', 'scan.outcome': 'SUCCESSFULL', 'scan.reason': 'NO_ERROR'}, 'message': 'Mertay Message', 'event': {}}

    import logging

    import ecs_logging

    logger = logging.getLogger("ecs-logger")
    logger.setLevel(logging.INFO)

    # Add an ECS formatter to the handler
    handler = logging.StreamHandler()
    formatter = ecs_logging.StdlibFormatter()
    handler.setFormatter(formatter)
    logger.addHandler(handler)

    my_dict = {
        "labels": {
            "scan.category": "APP_SECURTY_CATEGORY",
            "scan.type": "VULNCHECK",
            "scan.outcome": "SUCCESSFULL",
            "scan.reason": "NO_ERROR",
        },
        "message": "Mertay Message",
        "event": {},
    }

    logger.info(my_dict)

Could you help me identify what I am doing wrong? I am sure the problem I am having is very fundamental; I just don't have enough experience and couldn't find any guidance on how to create custom fields and log them properly. I have read the article about creating custom fields and I am using its schema, but that doesn't help either. I really appreciate the help, thank you!

Hi, @mertayd!

When using ecs-logging with the standard logging module, you need to put your additional fields into the extra keyword argument.

    my_dict = {
        "labels": {
            "scan.category": "APP_SECURTY_CATEGORY",
            "scan.type": "VULNCHECK",
            "scan.outcome": "SUCCESSFULL",
            "scan.reason": "NO_ERROR",
        },
        "event": {},
    }

    logger.info("Mertay Message", extra=my_dict)

Some more detail on using ecs-logging-python with the logging module is covered here: Installation | ECS Logging Python Reference | Elastic
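
For completeness, a minimal self-contained sketch of the whole thing (the logger name and message are just placeholders):

    import logging

    import ecs_logging

    logger = logging.getLogger("ecs-logger")
    logger.setLevel(logging.INFO)

    # Emit ECS-formatted JSON lines, one per log record
    handler = logging.StreamHandler()
    handler.setFormatter(ecs_logging.StdlibFormatter())
    logger.addHandler(handler)

    # Custom fields are passed through `extra`, not embedded in the message
    logger.info(
        "Mertay Message",
        extra={
            "labels": {
                "scan.category": "APP_SECURTY_CATEGORY",
                "scan.type": "VULNCHECK",
                "scan.outcome": "SUCCESSFULL",
                "scan.reason": "NO_ERROR",
            }
        },
    )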

One behavior I noticed while testing this out with your example: the dot (.) character in key names is auto-expanded into nested objects.

For example, the "scan.category": "APP_SECURTY_CATEGORY" field under labels became:

  "labels": {
    "scan": {
      "category": "APP_SECURTY_CATEGORY"
    }
  }

If you're following the ECS spec for the labels field, you'll need to avoid nested objects underneath labels.
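
For example, one way to stay within the spec is to flatten the key names yourself; the underscore convention here is just one option, not something the library requires:

    # labels should be a flat map of string keys to string values per ECS,
    # so avoid dots (which expand into nested objects) in the key names.
    # Assumes `logger` is configured with ecs_logging.StdlibFormatter as above.
    logger.info(
        "Mertay Message",
        extra={
            "labels": {
                "scan_category": "APP_SECURTY_CATEGORY",
                "scan_type": "VULNCHECK",
                "scan_outcome": "SUCCESSFULL",
                "scan_reason": "NO_ERROR",
            }
        },
    )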

Thank you for your prompt response, @ebeahan!

However, even though I made the changes you suggested, those fields still appear in Elastic under the message field.

Here is the updated snippet:

    my_dict = {
        "scan": {
            "category": "APP_SECURTY_CATEGORY",
            "type": "VULNCHECK",
            "outcome": "SUCCESSFULL",
            "reason": "NO_ERROR"
        }
    }

    # AWS Lambda handler; the logger is configured as in the earlier snippet
    def main(event, context):
        logger.info("My FIRST CORRECT MESSAGE!!!", extra=my_dict)
        return "Done"

(screenshot: the resulting log entry in Elastic, with the fields still inside message)

I want to see the scan field separately so I can visualize it in Kibana; that is my main problem. I hope you can help. Thank you in advance!

I still need help here.

Hi @mertayd,

From your screenshot it looks like your logged data is serialized into aws.cloudwatch.message as a JSON string on its way through CloudWatch. To deserialize the data, you can use the decode_json_fields processor in your aws module configuration, so that it contains something like the following:

- module: aws
  # ... other aws filesets ...

  cloudwatch:
    enabled: true
    # ... other cloudwatch config ...

    input:
      processors:
        - decode_json_fields:
            fields: ["aws.cloudwatch.message"] # the field to parse
            target: "" # write to root of document
            overwrite_keys: true # might be needed to override @timestamp and similar fields

I couldn't test this exact setup, but maybe it unblocks you in figuring it out. :crossed_fingers:
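
To illustrate the intended result (a sketch, not verified output): a document whose aws.cloudwatch.message holds that JSON string should end up with the embedded fields at the document root, e.g.:

  {
    "@timestamp": "...",
    "message": "My FIRST CORRECT MESSAGE!!!",
    "scan": {
      "category": "APP_SECURTY_CATEGORY",
      "type": "VULNCHECK",
      "outcome": "SUCCESSFULL",
      "reason": "NO_ERROR"
    }
  }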



And another thought: since you're adding custom fields (such as scan), I'd recommend enhancing the index template to include mappings for them. Otherwise, dynamic mapping might cause the scan.* fields to be interpreted in a way that hinders querying.
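
As a sketch of one way to do that (untested against your setup; adjust field names as needed), Filebeat can append custom fields to the index template it manages via filebeat.yml:

    setup.template.overwrite: true   # re-upload the template so the new fields take effect
    setup.template.append_fields:
      - name: scan.category
        type: keyword
      - name: scan.type
        type: keyword
      - name: scan.outcome
        type: keyword
      - name: scan.reason
        type: keyword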
