Hi all,
I am running a Logstash container with an S3 input pipeline configured as below:
input {
  s3 {
    id => "pipeline_s3_example_bucket_input"
    bucket => "example-bucket"
    region => "ap-southeast-1"
    access_key_id => "#######################"
    secret_access_key => "#######################"
    codec => "json_lines"
    sincedb_path => "/sincedbs/pipeline_s3_example_bucket.sincedb"
    prefix => "folderA"
    add_field => {
      "type" => "example-bucket-logs"
      "host" => "example-bucket"
    }
  }
}
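For completeness, the container is started along these lines (the image tag and host paths below are placeholders, not the exact values); with the official Logstash image, LOG_LEVEL=debug maps to log.level and makes the s3 input's polling and sincedb activity visible in the logs:

# Illustrative docker run; image tag and host paths are placeholders.
docker run --rm \
  -e LOG_LEVEL=debug \
  -v /host/pipeline:/usr/share/logstash/pipeline \
  -v /host/sincedbs:/sincedbs \
  docker.elastic.co/logstash/logstash:8.13.4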
My S3 bucket is encrypted with SSE-KMS using a customer managed key. I am using the IAM policy below for the user that reads the bucket data:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "IAMPolicy",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "kms:Decrypt",
        "s3:GetBucketLogging",
        "kms:GenerateDataKey",
        "kms:DescribeKey",
        "s3:GetObjectTagging",
        "s3:ListBucket",
        "s3:GetBucketVersioning",
        "s3:GetBucketLocation",
        "s3:GetObjectVersion"
      ],
      "Resource": [
        "arn:aws:kms:ap-southeast-1:<account-id>:key/<Key-ID>",
        "arn:aws:s3:::example-bucket",
        "arn:aws:s3:::example-bucket/*"
      ]
    }
  ]
}
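As a sanity check on the encryption side, the object metadata can be inspected with the CLI to confirm the objects report SSE-KMS with that key (the object key below is just a placeholder):

# Confirm an object is SSE-KMS encrypted with the expected key; the object key is a placeholder.
aws s3api head-object \
  --bucket example-bucket \
  --key folderA/example.log \
  --query '{Encryption: ServerSideEncryption, KmsKey: SSEKMSKeyId}'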
I also added the user's ARN to the Key users of the KMS key, and I tested the access using AWS CLI commands, which worked as well. However, Logstash is not pulling those logs, and I can't see any errors in the container logs either. Please help if anyone else has faced a similar issue.
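For reference, the CLI test that worked was roughly of this shape (again, the object key is a placeholder):

# List objects under the prefix, then fetch and decrypt one of them.
aws s3api list-objects-v2 --bucket example-bucket --prefix folderA
aws s3 cp s3://example-bucket/folderA/example.log -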