After restarting Logstash, the S3 output plugin stops working

Hello, everyone!
I use S3 as the Logstash output. When I first start Logstash it works well, but after I restart Logstash it does not work at all and generates many ERROR logs like the one below. How can I solve this problem?

On a related note, how can I configure Logstash to ship logs to one file per day in S3, rather than many part files? For example, if today is 2017-10-08, Logstash would write today's logs to a single S3 object named xxx.log-2017-10-08, then the next day create a new object named xxx.log-2017-10-09, and so on. (One possible approach is sketched after my config below.)

[2017-09-30T04:35:46,069][ERROR][logstash.outputs.s3 ] Uploading failed, retrying {:exception=>Errno::ENOENT, :message=>"No such file or directory - No such file or directory - /tmp/logstash/0658030e-b140-4523-8338-5ececfafecc3/crmweb/homethy-2017-09-30/ls.s3.d4704669-768b-4b81-aa15-2f50fc18c6ae.2017-09-30T04.30.part18.txt", :path=>"/tmp/logstash/0658030e-b140-4523-8338-5ececfafecc3/crmweb/homethy-2017-09-30/ls.s3.d4704669-768b-4b81-aa15-2f50fc18c6ae.2017-09-30T04.30.part18.txt", :backtrace=>["org/jruby/RubyFileTest.java:239:in `size'", "/home/ec2-user/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-resources-2.3.22/lib/aws-sdk-resources/services/s3/file_uploader.rb:31:in `upload'", "/home/ec2-user/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-resources-2.3.22/lib/aws-sdk-resources/services/s3/object.rb:251:in `upload_file'", "/home/ec2-user/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-s3-4.0.8/lib/logstash/outputs/s3/uploader.rb:38:in `upload'", "/home/ec2-user/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-s3-4.0.8/lib/logstash/outputs/s3/uploader.rb:29:in `upload_async'", "org/jruby/RubyProc.java:281:in `call'", "/home/ec2-user/logstash/vendor/bundle/jruby/1.9/gems/concurrent-ruby-1.0.5-java/lib/concurrent/executor/java_executor_service.rb:94:in `run'", "Concurrent$$JavaExecutorService$$Job_2081311098.gen:13:in `run'"]}

The following is my environment:

OS: CentOS 7
Logstash Version: 5.5.1

The following is the relevant part of my Logstash config (the s3 output section):

s3 {
  access_key_id     => "xxxxxxxxxxx"
  secret_access_key => "xxxxxxxxxxx"
  region            => "ap-northeast-1"
  prefix            => "crmweb/access-%{date_str}"
  bucket            => "xxxxxxx"
  size_file         => 102400000
  codec             => line { format => "%{message}" }
}
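For the daily-file question, one approach I am considering (just a sketch, not verified on this exact plugin version) is to put the date into the prefix using Logstash's sprintf date syntax and rotate on time only, so each day's events land under their own dated prefix. Note that the plugin still names the part files itself (ls.s3.<uuid>...partN.txt), so this gives one or more objects per day under a dated prefix rather than a single object literally named xxx.log-2017-10-08. The rotation_strategy and time_file settings below are my assumptions about what the 4.x plugin supports; credentials and bucket are placeholders:

s3 {
  access_key_id     => "xxxxxxxxxxx"                 # placeholder
  secret_access_key => "xxxxxxxxxxx"                 # placeholder
  region            => "ap-northeast-1"
  bucket            => "xxxxxxx"                     # placeholder
  # Date-based prefix: events from 2017-10-08 go under crmweb/access-2017-10-08/
  prefix            => "crmweb/access-%{+YYYY-MM-dd}"
  # Rotate on time only; time_file is in minutes, so 1440 is roughly one file per day
  rotation_strategy => "time"
  time_file         => 1440
  codec             => line { format => "%{message}" }
}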

I just ran a test: if there is only one s3 output block in the Logstash config file, it works well; otherwise it does not work and generates many error logs like the one above.
Logstash s3 plugin version: logstash-output-s3-4.0.8

OK, I have now solved the problem. The solution came from this GitHub issue:
https://github.com/logstash-plugins/logstash-output-s3/issues/143
So excited! :grinning:
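For anyone else who hits this: as I understand that issue, the ENOENT errors appear when more than one s3 output block (or a restarted pipeline restoring old files) shares the same temporary directory under /tmp/logstash, so one output removes temporary part files that another is still trying to upload. The workaround is to give each s3 block its own temporary_directory. A minimal sketch, where the bucket names, prefixes, and paths are placeholders:

s3 {
  region              => "ap-northeast-1"
  bucket              => "xxxxxxx"                    # placeholder
  prefix              => "crmweb/access-%{date_str}"
  size_file           => 102400000
  temporary_directory => "/tmp/logstash/s3_access"    # unique per output block
  codec               => line { format => "%{message}" }
}

s3 {
  region              => "ap-northeast-1"
  bucket              => "xxxxxxx"                    # placeholder
  prefix              => "crmweb/error-%{date_str}"
  size_file           => 102400000
  temporary_directory => "/tmp/logstash/s3_error"     # unique per output block
  codec               => line { format => "%{message}" }
}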
