How to parse AWS CloudFront logs

I'm trying to get Filebeat to parse AWS CloudFront logs.
The SQS side seems to work, but once the documents land in Elasticsearch I see:

```
Provided Grok expressions do not match field value: [2022-01-06\t20:06:28\tFRA56-C2\t1729\t3.125.241.170\tGET\\t/.well-known/...
```

```yaml
- module: aws
  enabled: true
  var.queue_url: "${SQS_QUEUE}"
  var.access_key_id: "${AWS_ACCESS_KEY_ID}"
  var.secret_access_key: "${AWS_SECRET_ACCESS_KEY}"
```

I've tried the elb fileset and the aws-s3 input as well, to no avail.

I am running Filebeat in Kubernetes using the Docker image:

The CloudFront log format is documented by AWS.

Sample log:

```
#Version: 1.0
#Fields: date time x-edge-location sc-bytes c-ip cs-method cs(Host) cs-uri-stem sc-status cs(Referer) cs(User-Agent) cs-uri-query cs(Cookie) x-edge-result-type x-edge-request-id x-host-header cs-protocol cs-bytes time-taken x-forwarded-for ssl-protocol ssl-cipher x-edge-response-result-type cs-protocol-version fle-status fle-encrypted-fields c-port time-to-first-byte x-edge-detailed-result-type sc-content-type sc-content-len sc-range-start sc-range-end
2022-01-06	14:37:55	MUC50-P1	405	GET	/versioninfo/version	200	-	curl/7.68.0	-	-	Miss	aekvP2AgpYb1uYYV1dd8lCjcUwhEJGQucpuvgzafdJ1XXscToIlnPg==	https	61	0.051	-	TLSv1.3	TLS_AES_128_GCM_SHA256	Miss	HTTP/2.0	-	-	49716	0.051	Miss	text/plain;%20charset=utf-8	-	-	-
```

Any help or pointers are welcome.

CloudFront logs and S3 access logs are not the same format, so the module's grok patterns won't match. You'll have to use a manual aws-s3 input and define a custom ingest pipeline to parse them.
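As a rough illustration of what such a custom pipeline could look like, here is a minimal sketch of an Elasticsearch ingest pipeline that drops the `#Version`/`#Fields` header lines and splits the tab-separated message with a `csv` processor. The pipeline name `cloudfront-logs` and the target field names are my own choices for the example; a real pipeline should cover every column from the `#Fields` header and add type conversions.

```json
PUT _ingest/pipeline/cloudfront-logs
{
  "description": "Sketch: parse tab-separated CloudFront access logs",
  "processors": [
    {
      "drop": {
        "if": "ctx.message != null && ctx.message.startsWith('#')"
      }
    },
    {
      "csv": {
        "field": "message",
        "separator": "\t",
        "ignore_missing": true,
        "target_fields": [
          "date", "time", "x_edge_location", "sc_bytes", "c_ip",
          "cs_method", "cs_host", "cs_uri_stem", "sc_status",
          "cs_referer", "cs_user_agent", "cs_uri_query", "cs_cookie",
          "x_edge_result_type", "x_edge_request_id"
        ]
      }
    }
  ]
}
```

Only the first fifteen columns are listed here to keep the sketch short; extend `target_fields` to match the full `#Fields` header of your logs.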

Thank you. Any pointers to examples or docs? There doesn't seem to be a great deal of information on this.

See AWS S3 input | Filebeat Reference [7.16] | Elastic for how to set up the input. I built this pipeline last night: [AWS] Add CloudFront logs datastream by legoguy1000 · Pull Request #2476 · elastic/integrations · GitHub. You'll just have to point Filebeat to that pipeline in Elasticsearch, either at the input or at the output.
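For the "at the input" option, a minimal sketch of a filebeat.yml using the aws-s3 input with a per-input `pipeline` setting could look like this. The pipeline name `cloudfront-logs` is a placeholder for whatever name you give your custom ingest pipeline; the environment variables are the same ones used in the module config above.

```yaml
filebeat.inputs:
  - type: aws-s3
    queue_url: "${SQS_QUEUE}"
    access_key_id: "${AWS_ACCESS_KEY_ID}"
    secret_access_key: "${AWS_SECRET_ACCESS_KEY}"
    # Send events through a custom Elasticsearch ingest pipeline
    # instead of the aws module's S3-access-log grok patterns.
    pipeline: cloudfront-logs
```

Alternatively, the same `pipeline` setting can be applied in the `output.elasticsearch` section to route all events through the pipeline.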

I'm very impressed! Hope this PR gets accepted soon.

Thank you.

