S3 output plugin timeout error

I am new to Logstash. I am running version 1:5.0.0-1 on an Amazon Linux EC2 instance and am trying to upload a file to S3 using this example config:

input {
  file {
    path => "/tmp/log.txt"
    start_position => "beginning"
  }
}
output {
  s3 {
    access_key_id => "my_secret_key"
    secret_access_key => "my_secret_access_key"
    region => "ap-southeast-2"
    bucket => "my_bucket"
    size_file => "2048"
    time_file => "1"
    server_side_encryption => "true"
    canned_acl => "public_read_write"
  }
}
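One thing I was unsure about: if the instance only has outbound internet access through a proxy, does the plugin need that configured explicitly? From what I can tell from the AWS config mixin docs, the output accepts a proxy_uri option; something like the below (the proxy address is just a placeholder, not my real setup):

```
output {
  s3 {
    ...
    proxy_uri => "http://my-proxy.example.com:3128"
  }
}
```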

I keep getting the following error:

[2016-11-04T14:42:05,304][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2016-11-04T14:42:08,290][WARN ][logstash.agent ] stopping pipeline {:id=>"main"}
[2016-11-04T14:42:19,426][INFO ][logstash.outputs.s3 ] Registering s3 output {:bucket=>"my_bucket_name", :endpoint_region=>"ap-southeast-2"}
[2016-11-04T14:43:21,585][ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>#<Timeout::Error: execution expired>, :backtrace=>["org/jruby/ext/socket/RubyTCPSocket.java:111:in `initialize'", "org/jruby/RubyIO.java:1197:in `open'", "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:763:in `connect'", "org/jruby/ext/timeout/Timeout.java:115:in `timeout'", "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:763:in `connect'", "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:756:in `do_start'", "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/net/http.rb:751:in `start'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-v1-1.66.0/lib/aws/core/http/connection_pool.rb:327:in `start_session'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-v1-1.66.0/lib/aws/core/http/connection_pool.rb:127:in `session_for'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-v1-1.66.0/lib/aws/core/http/net_http_handler.rb:56:in `handle'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-v1-1.66.0/lib/aws/core/client.rb:253:in `make_sync_request'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-v1-1.66.0/lib/aws/core/client.rb:289:in `retry_server_errors'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-v1-1.66.0/lib/aws/s3/region_detection.rb:11:in `retry_server_errors'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-v1-1.66.0/lib/aws/core/client.rb:249:in `make_sync_request'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-v1-1.66.0/lib/aws/core/client.rb:509:in `client_request'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-v1-1.66.0/lib/aws/core/client.rb:391:in `log_client_request'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-v1-1.66.0/lib/aws/core/client.rb:477:in `client_request'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-v1-1.66.0/lib/aws/core/client.rb:373:in `return_or_raise'",
"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-v1-1.66.0/lib/aws/core/client.rb:476:in `client_request'", "(eval):3:in `put_object'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-v1-1.66.0/lib/aws/s3/s3_object.rb:1765:in `write_with_put_object'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-v1-1.66.0/lib/aws/s3/s3_object.rb:611:in `write'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-s3-3.2.0/lib/logstash/outputs/s3.rb:182:in `write_on_bucket'", "org/jruby/RubyIO.java:1201:in `open'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-s3-3.2.0/lib/logstash/outputs/s3.rb:175:in `write_on_bucket'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-s3-3.2.0/lib/logstash/outputs/s3.rb:260:in `test_s3_write'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-s3-3.2.0/lib/logstash/outputs/s3.rb:232:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/single.rb:9:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:37:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:196:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:196:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:153:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:250:in `start_pipeline'

I can see that the logstash-programmatic-access-test-object-1478285125 file is being created locally in /tmp/logstash, so the plugin's register-time write test does run; it just never reaches S3.

I can successfully run "aws s3 ls s3://my_bucket_name" and "aws s3 cp testfile s3://my_bucket_name --sse" from the same instance without any issues. Any idea what the error might mean, or what I could try to resolve it?
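Since the backtrace shows a plain TCP connect timing out inside Net::HTTP, I also put together a small check of my own (not something from the plugin) to confirm raw reachability of the regional endpoint from the instance; the hostname below is my assumption based on the region setting:

```python
import socket

def can_connect(host, port=443, timeout=3):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Endpoint name assumed from region => "ap-southeast-2"; the result obviously
# depends on the instance's security groups, NACLs, and route tables.
can_connect("s3.ap-southeast-2.amazonaws.com")
```

If that returns False while the AWS CLI works, I suppose the CLI must be taking a different network path (e.g. a proxy from the environment that the JVM process does not pick up), but I have not confirmed that.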

Thanks