S3 Access denied error in s3snssqs logstash plugin

Hi Team,

Here is my pipeline configuration (credentials and names redacted):

input {
  s3snssqs {
    region                     => "us-east-1"
    s3_default_options         => { "endpoint_discovery" => true }
    queue                      => "queue name"
    queue_owner_aws_account_id => "account id"
    access_key_id              => "jhjhjhjhjh"
    secret_access_key          => "gjjhjhjhjhjhjhjh" 
    role_arn                   => "assumed role with admin access"
    sqs_skip_delete            => true
    codec                      => line
    from_sns                   => false
    temporary_directory        => "Temp"
    s3_options_by_bucket => [
        {
            bucket_name       => "bucket name"
            access_key_id     => "hjhg"
            secret_access_key => "uyuyuuyuyuyu"
            role_arn          => "ytuuuyuyuys"
        }
    ]

  }                 
}

output {
    stdout {  }
}

I am trying to pull data from another AWS account using an assumed role, but I am getting the error below. I have given admin access to that assumed role.
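From what I understand, even with admin permissions attached to the assumed role in my account, the bucket in the other account may also need its own bucket policy that allows the role. This is roughly what I believe such a policy would look like (the account ID, role name, and bucket name below are placeholders, not my actual values):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCrossAccountReadFromLogstashRole",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111122223333:role/logstash-assumed-role"
      },
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::bucket name",
        "arn:aws:s3:::bucket name/*"
      ]
    }
  ]
}
```

Is something like this required on the bucket side, or should the role's permissions alone be enough here?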

Sending Logstash logs to D:/logstash/sqslog/logstash-7.8.0/logstash-7.8.0/logs which is now configured via log4j2.properties
[2020-08-30T20:38:58,309][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-08-30T20:38:58,488][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.8.0", "jruby.version"=>"jruby 9.2.11.1 (2.5.7) 2020-03-25 b1f55b1a40 Java HotSpot(TM) 64-Bit Server VM 25.191-b12 on 1.8.0_191-b12 +indy +jit [mswin32-x86_64]"}
[2020-08-30T20:39:02,002][INFO ][org.reflections.Reflections] Reflections took 77 ms to scan 1 urls, producing 21 keys and 41 values
[2020-08-30T20:39:40,571][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["D:/logstash/sqslog/logstash-7.8.0/logstash-7.8.0/pipelines/s3snssqs.conf"], :thread=>"#<Thread:0x57a24885 run>"}
[2020-08-30T20:39:41,701][INFO ][logstash.inputs.s3snssqs ][main] Registering SQS input {:queue=>"queue name"}
[2020-08-30T20:39:45,375][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-08-30T20:39:45,572][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-08-30T20:39:46,021][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2020-08-30T20:39:46,040][INFO ][logstash.inputs.s3snssqs ][main][c4679ab50877d70ccbc9c82e2320be8d5ad47d428a0b46e6734101f7f71d0f25] [Worker c4679ab50877d70ccbc9c82e2320be8d5ad47d428a0b46e6734101f7f71d0f25/0] started (2020-08-30 20:39:46 +0530)
[2020-08-30T20:39:49,250][ERROR][logstash.inputs.s3snssqs ][main][c4679ab50877d70ccbc9c82e2320be8d5ad47d428a0b46e6734101f7f71d0f25] Unable to download file. Requeuing the message {:error=>#<Aws::S3::Errors::AccessDenied: Access Denied>, :record=>{:bucket=>"bucket name", :key=>"docname.txt", :size=>19, :folder=>"", :local_file=>"C:/Users/user/AppData/Local/Temp/Temp20200830-12732-2glfu1/docname.txt"}}
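For what it's worth, I also sanity-checked that the object ARN I would allow in a bucket policy actually matches the bucket/key shown in the error record above. This is just local string handling with placeholder names, no AWS calls:

```python
import fnmatch

def s3_object_arn(bucket: str, key: str) -> str:
    """Build the S3 object ARN used in policy Resource entries."""
    return f"arn:aws:s3:::{bucket}/{key}"

def policy_covers(resource_pattern: str, object_arn: str) -> bool:
    """Rough check that a Resource pattern ('*' wildcards only) covers an ARN."""
    return fnmatch.fnmatch(object_arn, resource_pattern)

# Bucket and key taken from the error record above (bucket name is a placeholder).
arn = s3_object_arn("bucket name", "docname.txt")
print(arn)
print(policy_covers("arn:aws:s3:::bucket name/*", arn))
```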

Could someone help me fix this issue?

Thanks!