Hi,
I'm using the s3 output plugin to write to a private S3 bucket on AWS. With canned_acl => "private" uploads work fine, but setting it to "authenticated_read" produces errors like this:
[ERROR][logstash.outputs.s3 ] Uploading failed, retrying {:exception=>Aws::S3::Errors::InvalidArgument, :message=>"", :path=>"/tmp/logstash/d34322eb-45dd-4a0c-8b89-a1d3ec41083b/Logs/2017-04-17/ls.s3.9a34ea3d-4d18-4456-a7b4-6daeda20f12b.2017-04-17T15.44.part0.txt", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-core-2.3.22/lib/seahorse/client/plugins/raise_response_errors.rb:15:in `call'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/s3_sse_cpk.rb:19:in `call'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/s3_accelerate.rb:33:in `call'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/param_converter.rb:20:in `call'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-core-2.3.22/lib/seahorse/client/plugins/response_target.rb:21:in `call'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-core-2.3.22/lib/seahorse/client/request.rb:70:in `send_request'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-core-2.3.22/lib/seahorse/client/base.rb:207:in `put_object'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-resources-2.3.22/lib/aws-sdk-resources/services/s3/file_uploader.rb:42:in `put_object'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-resources-2.3.22/lib/aws-sdk-resources/services/s3/file_uploader.rb:49:in `open_file'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-resources-2.3.22/lib/aws-sdk-resources/services/s3/file_uploader.rb:41:in `put_object'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-resources-2.3.22/lib/aws-sdk-resources/services/s3/file_uploader.rb:34:in `upload'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-resources-2.3.22/lib/aws-sdk-resources/services/s3/object.rb:251:in `upload_file'", 
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-s3-4.0.5/lib/logstash/outputs/s3/uploader.rb:38:in `upload'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-s3-4.0.5/lib/logstash/outputs/s3/uploader.rb:29:in `upload_async'", "org/jruby/RubyProc.java:281:in `call'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/concurrent-ruby-1.0.0-java/lib/concurrent/executor/java_executor_service.rb:94:in `run'", "Concurrent$$JavaExecutorService$$Job_1733552426.gen:13:in `run'"]}
Sanitized output configuration:
s3 {
  canned_acl => "authenticated_read"
  codec => "json_lines"
  time_file => 10
  size_file => 5242880
}
The bucket is private. Similar errors also occur when using "public_read" as the canned_acl.
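For what it's worth, the S3 REST API's x-amz-acl header expects hyphenated canned-ACL names (e.g. "authenticated-read", "public-read"), while the plugin option uses underscores, which might be related to the InvalidArgument here. A minimal Ruby sketch of the translation I would expect to happen somewhere before the SDK call (normalize_canned_acl is a hypothetical helper, not part of the plugin):

```ruby
# Canned ACL names as the S3 REST API accepts them (hyphenated),
# per the x-amz-acl request header documentation.
S3_CANNED_ACLS = %w[
  private public-read public-read-write authenticated-read
  aws-exec-read bucket-owner-read bucket-owner-full-control
  log-delivery-write
].freeze

# Hypothetical helper: translate the underscored Logstash option value
# (e.g. "authenticated_read") into the hyphenated form the API accepts,
# rejecting values that are not canned ACLs at all.
def normalize_canned_acl(option_value)
  candidate = option_value.tr("_", "-")
  unless S3_CANNED_ACLS.include?(candidate)
    raise ArgumentError, "unknown canned ACL: #{option_value}"
  end
  candidate
end

puts normalize_canned_acl("authenticated_read") # => "authenticated-read"
puts normalize_canned_acl("private")            # => "private"
```

Note that "private" is identical in both spellings, which would explain why that value is the only one that works for me.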