Hi there!
I've been setting up the aws module with the ELB fileset and got it mostly working. Filebeat reads the SQS queue, then fetches the matching log file from S3, but it doesn't parse the access log message at all.
I'm running Filebeat in Kubernetes, using the official image docker.elastic.co/beats/filebeat:7.9.2.
Here is my config:
filebeat.modules:
- module: aws
  elb:
    enabled: true
    var.queue_url: https://sqs.{{ aws_region }}.amazonaws.com/{{ aws_account_id }}/{{ sqs_queue_name }}
    var.shared_credential_file: /etc/filebeat/aws_credentials
    credential_profile_name: default
    var.visibility_timeout: 300s
    var.api_timeout: 120s
  cloudtrail.enabled: false
  cloudwatch.enabled: false
  ec2.enabled: false
  s3access.enabled: false
  vpcflow.enabled: false
output.console:
  enabled: true
Here is a sample output:
{
  "@timestamp": "2020-10-01T13:38:07.424Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "_doc",
    "version": "7.9.2",
    "_id": "fa53066972-000002163821",
    "pipeline": "filebeat-7.9.2-aws-elb-pipeline"
  },
  "aws": {
    "s3": {
      "bucket": {
        "name": "???????????",
        "arn": "arn:aws:s3:::??????????????"
      },
      "object.key": "??????/AWSLogs/???/elasticloadbalancing/us-east-1/2020/09/30/???????.log"
    }
  },
  "tags": [
    "forwarded"
  ],
  "agent": {
    "name": "ip-??-??-??-??.ec2.internal",
    "type": "filebeat",
    "version": "7.9.2",
    "hostname": "ip-??-??-??-??.ec2.internal",
    "ephemeral_id": "b132b4bc-5c12-47bc-9495-1e5b00747d80",
    "id": "00f8a6fa-f9b4-40bf-b8fa-5cbd42e54a4c"
  },
  "ecs": {
    "version": "1.5.0"
  },
  "message": "2020-09-30T22:49:17.793407Z aa4ac2f5ea65111eaa5a30e9585490a4 ??.??.??.??:46312 ??.??.??.??:31184 0.00003 0.014323 0.000057 200 200 0 356 \"GET http://????.lan:80/???????/?list-type=2&prefix=repositories%2Fnone%2Fdataset%2Ffull_content&fetch-owner=false HTTP/1.1\" \"aws-sdk-java/1.11.415 Linux/4.14.186-146.268.amzn2.x86_64 OpenJDK_64-Bit_Server_VM/11.0.6+10-LTS java/11.0.6\" - -",
  "log": {
    "offset": 2163821,
    "file.path": "https://?????.s3-us-east-1.amazonaws.com/???????/AWSLogs/?????/elasticloadbalancing/us-east-1/2020/09/30/?????.log"
  },
  "cloud": {
    "provider": "aws",
    "region": "us-east-1"
  },
  "event": {
    "module": "aws",
    "dataset": "aws.elb"
  },
  "fileset": {
    "name": "elb"
  },
  "service": {
    "type": "aws"
  },
  "input": {
    "type": "s3"
  }
}
As you can see, no aws.elb fields are created, even though dataset: "aws.elb" and "pipeline": "filebeat-7.9.2-aws-elb-pipeline" are both set correctly.
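For reference, here is a rough Python sketch of what I would expect the pipeline to extract from that message field. The regex is my own simplified approximation of the classic-ELB access log format, not the module's actual grok pattern, and the sample line uses made-up placeholder values:

```python
import re

# Hypothetical sample line in the classic ELB access log format
# (placeholder values, not real data).
ELB_LINE = (
    '2020-09-30T22:49:17.793407Z my-elb 10.0.0.1:46312 10.0.0.2:31184 '
    '0.00003 0.014323 0.000057 200 200 0 356 '
    '"GET http://example.lan:80/path HTTP/1.1" '
    '"aws-sdk-java/1.11.415" - -'
)

# Simplified approximation of the ELB access log layout; the real
# module uses an Elasticsearch grok pattern with different field names.
PATTERN = re.compile(
    r'(?P<timestamp>\S+) (?P<elb>\S+) (?P<client>\S+) (?P<backend>\S+) '
    r'(?P<request_processing_time>\S+) (?P<backend_processing_time>\S+) '
    r'(?P<response_processing_time>\S+) (?P<elb_status_code>\d+) '
    r'(?P<backend_status_code>\d+) (?P<received_bytes>\d+) '
    r'(?P<sent_bytes>\d+) "(?P<request>[^"]*)" "(?P<user_agent>[^"]*)" '
    r'(?P<ssl_cipher>\S+) (?P<ssl_protocol>\S+)'
)

fields = PATTERN.match(ELB_LINE).groupdict()
print(fields["elb_status_code"])  # prints: 200
```

None of these parsed fields show up in the event above; the message stays as one raw string.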