Hello,
I would like to ship all my AWS CloudWatch metrics from one of my AWS accounts to Elasticsearch. I have installed Metricbeat 7.4.2 and configured the aws module’s “cloudwatch” metricset like so:
- module: aws
  period: 60s
  metricsets:
    - cloudwatch
  metrics:
    - namespace: AWS/DynamoDB
    - namespace: AWS/Events
    - namespace: AWS/Lambda
    - namespace: AWS/Usage
  regions:
    - eu-west-2
I can see documents arriving in Elasticsearch, but it looks as if Metricbeat is sampling the metrics rather than importing the values verbatim.
For example, one CloudWatch time series contains a datapoint every 120 seconds. Here are some of its datapoints via a CloudWatch GetMetricData call:
"MetricDataResults": [
{
"Id": "id_1",
"Label": "Duration",
"Timestamps": [
"2019-11-28T13:18:00Z",
"2019-11-28T13:16:00Z",
"2019-11-28T13:14:00Z"
],
"Values": [
18282.56,
16626.54,
14270.99
],
"StatusCode": "Complete"
}
]
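For reference, the GetMetricData request was roughly along the lines of the sketch below (e.g. via boto3; the namespace, function-name dimension, time range, and "Maximum" statistic shown here are placeholders rather than my exact values):

import boto3
from datetime import datetime, timezone

# Region matches my Metricbeat config; everything else below is illustrative.
cloudwatch = boto3.client("cloudwatch", region_name="eu-west-2")

response = cloudwatch.get_metric_data(
    MetricDataQueries=[
        {
            "Id": "id_1",
            "MetricStat": {
                "Metric": {
                    "Namespace": "AWS/Lambda",      # placeholder namespace
                    "MetricName": "Duration",
                    "Dimensions": [
                        {"Name": "FunctionName", "Value": "my-function"},  # placeholder dimension
                    ],
                },
                "Period": 120,                      # the series has a datapoint every 120 seconds
                "Stat": "Maximum",
            },
        }
    ],
    StartTime=datetime(2019, 11, 28, 13, 10, tzinfo=timezone.utc),
    EndTime=datetime(2019, 11, 28, 13, 20, tzinfo=timezone.utc),
)

# Prints the same Timestamps/Values pairs shown above.
for result in response["MetricDataResults"]:
    print(result["Label"], list(zip(result["Timestamps"], result["Values"])))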
But I see the following in Elasticsearch:
Time (i.e. @timestamp)         aws.metrics.Duration.max
Nov 28, 2019 @ 13:20:21.748    18,282.56
Nov 28, 2019 @ 13:19:21.748    18,282.56
Nov 28, 2019 @ 13:18:21.748    16,626.54
Nov 28, 2019 @ 13:17:21.748    16,626.54
Nov 28, 2019 @ 13:16:21.748    14,270.99
Nov 28, 2019 @ 13:15:21.748    14,270.99
Notice that the timestamps do not match the CloudWatch timestamps exactly, and each datapoint appears twice (one duplicate per datapoint, because the metricset period is 60s while the series only has a datapoint every 120s).
Is there any way to configure Metricbeat (or another Elastic product, such as another Beat or Logstash) to import the CloudWatch datapoints exactly as they are, with their original timestamps and without duplicates?
Thanks for reading,
Paul