Issues connecting to S3 bucket from Logstash

Hello all,
I'm setting up a Logstash server in AWS that pulls logs from an S3 bucket. When I start the Logstash instance, I see the error below in the logs.

[2018-07-12T14:25:24,653][INFO ][logstash.inputs.s3 ] Registering s3 input {:bucket=>"bucket-name", :region=>"us-east-1"}
[2018-07-12T14:25:25,025][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x2d782c29 run>"}
[2018-07-12T14:25:25,139][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-07-12T14:25:25,767][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-07-12T14:25:27,415][ERROR][logstash.inputs.s3 ] S3 input: Unable to list objects in bucket {:prefix=>nil, :message=>"The AWS Access Key Id you provided does not exist in our records."}
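
From the error text, it looks like AWS is rejecting the access key itself rather than denying permission on the bucket. In case it helps, this is how I plan to double-check the key pair outside of Logstash (this assumes the AWS CLI is installed and configured with the same access key and secret):

aws sts get-caller-identity                      # does AWS recognize this key pair at all?
aws s3 ls s3://bucket-name --region us-east-1    # can the key list the bucket?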

I followed the steps outlined here to set up my conf file.

My conf file has the values below for the S3 bucket. I'm not sure why I keep getting that message when the keys are provided in the conf file.

input {
  s3 {
    bucket            => "bucket-name"
    access_key_id     => "Axxxxxxxxxxxxxx"
    secret_access_key => "sxxxxxxxxxxxx"
    region            => "us-east-1"
  }
}
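
Since the server already runs in AWS, one alternative I've seen suggested is to drop the static keys entirely and attach an IAM role to the EC2 instance; my understanding is that when access_key_id / secret_access_key are not set, the plugin falls back to the standard AWS credential chain (environment variables, credentials file, then the instance profile). A sketch, assuming the instance role grants s3:ListBucket and s3:GetObject on the bucket:

input {
  s3 {
    bucket => "bucket-name"
    region => "us-east-1"
    # no access_key_id / secret_access_key: credentials come from the instance profile
  }
}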

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:date_time} %{LOGLEVEL:log-level} %{DATA:key} %{GREEDYDATA:application} %{DATA:action} %{DATA:result} %{DATA:details}" }
  }
}
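
A side note on my own grok pattern: GREEDYDATA in the middle of a pattern is greedy, so it tends to swallow the text that the trailing DATA fields were meant to capture (DATA can match an empty string). If those fields come out empty once the S3 issue is fixed, a variant I may try is to keep GREEDYDATA last (the field layout here is a guess, since I don't have a sample log line in front of me):

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:date_time} %{LOGLEVEL:log-level} %{DATA:key} %{DATA:application} %{DATA:action} %{DATA:result} %{GREEDYDATA:details}" }
  }
}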

output {
  elasticsearch {
    hosts => ["vpc-ivr-core-dr2mlk5allufnmlrpya2sdtd2y.us-east-1.es.amazonaws.com:443"]
    index => "my_index"
  }
}
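
One more thing I'm unsure about, separate from the S3 error: the AWS Elasticsearch endpoint on port 443 only speaks HTTPS, and my understanding is that the elasticsearch output defaults to plain HTTP unless the scheme is given in the host, so I may also need something like:

output {
  elasticsearch {
    # explicit https:// scheme so the output uses TLS on port 443
    hosts => ["https://vpc-ivr-core-dr2mlk5allufnmlrpya2sdtd2y.us-east-1.es.amazonaws.com:443"]
    index => "my_index"
  }
}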
