Hello.
Regarding the issue in the title, is there any way to fix it?
I suspect it's a bug in the plugin.
- Version: logstash-input-s3-3.5.0
- Operating System: Windows Server 2016
- Config File:
input {
  s3 {
    bucket => "fukasawah-123123"
    access_key_id => "*********"
    secret_access_key => "******"
    region => "ap-northeast-1"
    prefix => "target/"
    backup_to_bucket => "fukasawah-123123"
    backup_add_prefix => "done/"
    delete => true
    interval => 60
    watch_for_new_files => true
  }
}
output {
  stdout { codec => rubydebug }
}
- Sample Data: None
- Steps to Reproduce:
  - Put "target/あいうえお.csv" into the S3 bucket
  - Run: logstash -f this.conf
- Expected: the object is copied to #{backup_to_bucket}/#{backup_add_prefix}#{object.key} (e.g. fukasawah-123123/done/target/あいうえお.csv)
- Actual: NoSuchKey error
Error log
[2020-08-13T11:49:33,699][ERROR][logstash.javapipeline ][main][43583c7fe524300b82b78d448ad2a716bb12e59208496b16db6d6b9c20e49f60] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <LogStash::Inputs::S3 bucket=>"fukasawah-123123", access_key_id=>"*****", backup_to_bucket=>"fukasawah-123123", prefix=>"target/", backup_add_prefix=>"done/", secret_access_key=><password>, interval=>60, id=>"43583c7fe524300b82b78d448ad2a716bb12e59208496b16db6d6b9c20e49f60", region=>"ap-northeast-1", watch_for_new_files=>true, enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_6df6be98-b9ba-4b6a-8304-*************", enable_metric=>true, charset=>"UTF-8">, role_session_name=>"logstash", delete=>false, temporary_directory=>"C:/Users/ADMINI~1/AppData/Local/Temp/2/logstash", include_object_properties=>false, gzip_pattern=>".gz(ip)?$">
Error: The specified key does not exist.
Exception: Aws::S3::Errors::NoSuchKey
Stack: C:/opt/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.540/lib/seahorse/client/plugins/raise_response_errors.rb:15:in `call'
C:/opt/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.540/lib/aws-sdk-core/plugins/s3_sse_cpk.rb:19:in `call'
C:/opt/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.540/lib/aws-sdk-core/plugins/s3_dualstack.rb:24:in `call'
C:/opt/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.540/lib/aws-sdk-core/plugins/s3_accelerate.rb:34:in `call'
C:/opt/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.540/lib/aws-sdk-core/plugins/jsonvalue_converter.rb:20:in `call'
C:/opt/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.540/lib/aws-sdk-core/plugins/idempotency_token.rb:18:in `call'
C:/opt/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.540/lib/aws-sdk-core/plugins/param_converter.rb:20:in `call'
C:/opt/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.540/lib/seahorse/client/plugins/response_target.rb:21:in `call'
C:/opt/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.540/lib/seahorse/client/request.rb:70:in `send_request'
C:/opt/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-core-2.11.540/lib/seahorse/client/base.rb:207:in `block in define_operation_methods'
C:/opt/logstash/vendor/bundle/jruby/2.5.0/gems/aws-sdk-resources-2.11.540/lib/aws-sdk-resources/services/s3/object.rb:64:in `copy_from'
C:/opt/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-s3-3.5.0/lib/logstash/inputs/s3.rb:160:in `backup_to_bucket'
C:/opt/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-s3-3.5.0/lib/logstash/inputs/s3.rb:380:in `process_log'
C:/opt/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-s3-3.5.0/lib/logstash/inputs/s3.rb:180:in `block in process_files'
org/jruby/RubyArray.java:1809:in `each'
C:/opt/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-s3-3.5.0/lib/logstash/inputs/s3.rb:176:in `process_files'
C:/opt/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-s3-3.5.0/lib/logstash/inputs/s3.rb:123:in `block in run'
C:/opt/logstash/vendor/bundle/jruby/2.5.0/gems/stud-0.0.23/lib/stud/interval.rb:20:in `interval'
C:/opt/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-s3-3.5.0/lib/logstash/inputs/s3.rb:122:in `run'
C:/opt/logstash/logstash-core/lib/logstash/java_pipeline.rb:374:in `inputworker'
C:/opt/logstash/logstash-core/lib/logstash/java_pipeline.rb:365:in `block in start_input'
In the AWS SDK for Ruby v2, the :copy_source argument to Aws::S3::Object#copy_from must be URL-encoded, but the plugin's backup_to_bucket code does not appear to do this before calling copy_from, so keys containing non-ASCII characters fail with NoSuchKey.
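If that is the cause, a minimal sketch of a possible fix could look like the following. Note that encoded_copy_source is a hypothetical helper for illustration, not an existing method in the plugin; it URL-encodes each path segment of the key while leaving the bucket name and "/" separators intact:

```ruby
require "erb"

# Hypothetical helper: build a :copy_source value with the object key
# URL-encoded. Each path segment is escaped individually so that the
# "/" separators themselves are not percent-encoded.
def encoded_copy_source(bucket, key)
  encoded_key = key.split("/").map { |seg| ERB::Util.url_encode(seg) }.join("/")
  "#{bucket}/#{encoded_key}"
end

puts encoded_copy_source("fukasawah-123123", "target/あいうえお.csv")
# => fukasawah-123123/target/%E3%81%82%E3%81%84%E3%81%86%E3%81%88%E3%81%8A.csv
```

The plugin could then pass this encoded string as :copy_source instead of the raw "#{bucket}/#{key}", which should avoid the NoSuchKey error for non-ASCII keys.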