After restarting Logstash, the s3 output plugin does not work

(sockaddr_in) #1

Hello, everyone!
I use the s3 as the logstash output. When i first start the logstash, it works well, but after i restart logstash is does not work at all and generated many ERROR log as following, so how to solve this problem.And another hand, how to config logstash transport log to one file daily, not many parts of file. Ex:today is 2017-10-08, then logstash will transport today's log to s3 one file named xxx.log-2017-10-08, and next day will create a log file in s3 named xxx.log-2017-10-09 and so on.

[2017-09-30T04:35:46,069][ERROR][logstash.outputs.s3 ] Uploading failed, retrying {:exception=>Errno::ENOENT, :message=>"No such file or directory - No such file or directory - /tmp/logstash/0658030e-b140-4523-8338-5ececfafecc3/crmweb/homethy-2017-09-30/ls.s3.d4704669-768b-4b81-aa15-2f50fc18c6ae.2017-09-30T04.30.part18.txt", :path=>"/tmp/logstash/0658030e-b140-4523-8338-5ececfafecc3/crmweb/homethy-2017-09-30/ls.s3.d4704669-768b-4b81-aa15-2f50fc18c6ae.2017-09-30T04.30.part18.txt", :backtrace=>["org/jruby/ size'", "/home/ec2-user/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-resources-2.3.22/lib/aws-sdk-resources/services/s3/file_uploader.rb:31:in upload'", "/home/ec2-user/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-resources-2.3.22/lib/aws-sdk-resources/services/s3/object.rb:251:in upload_file'", "/home/ec2-user/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-s3-4.0.8/lib/logstash/outputs/s3/uploader.rb:38:in upload'", "/home/ec2-user/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-s3-4.0.8/lib/logstash/outputs/s3/uploader.rb:29:in upload_async'", "org/jruby/'", "/home/ec2-user/logstash/vendor/bundle/jruby/1.9/gems/concurrent-ruby-1.0.5-java/lib/concurrent/executor/java_executor_service.rb:94:in run'", "Concurrent$$JavaExecutorService$$Job_2081311098.gen:13:in run'"]}

The following is my ENV:

Logstash Version: 5.5.1

The following is the relevant part of my Logstash config (the s3 output settings):

access_key_id => "xxxxxxxxxxx"
secret_access_key => "xxxxxxxxxxx"
region => "ap-northeast-1"
prefix => "crmweb/access-%{date_str}"
bucket => "xxxxxxx"
size_file => 102400000
codec => line { format => "%{message}" }
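Regarding the one-file-per-day question: a sketch of one approach, assuming the logstash-output-s3 4.x plugin interpolates time-based sprintf patterns such as %{+YYYY-MM-dd} in prefix (the field name and paths below are illustrative, not from the thread). Note that size_file and upload rotation can still split a day into multiple part objects, so a strictly single object per day is not guaranteed by the plugin:

s3 {
  region => "ap-northeast-1"
  bucket => "xxxxxxx"
  # one S3 key prefix per calendar day, taken from the event @timestamp
  prefix => "crmweb/access-%{+YYYY-MM-dd}"
  codec => line { format => "%{message}" }
}

If you need exactly one object per day, it is usually easier to let the plugin write its parts and merge them in S3 afterwards (e.g. with a scheduled job) than to fight the rotation settings.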

(sockaddr_in) #2

I just ran a test: if there is only one s3 config block in the Logstash config file, it works well; with more than one, it does not work and generates many error logs like the above.
Logstash s3 plugin version: logstash-output-s3-4.0.8

(sockaddr_in) #3

OK, I have now solved the problem; the solution came from this GitHub issue:
So excited!!! :grinning:
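
For anyone landing here with the same ENOENT error: one plausible cause (my assumption, not stated explicitly in this thread) is that multiple s3 output blocks share the plugin's default temporary directory and clean up each other's part files. The s3 output does provide a temporary_directory option, so a sketch of a workaround is to give each block its own scratch directory (the paths below are hypothetical):

s3 {
  region => "ap-northeast-1"
  bucket => "xxxxxxx"
  prefix => "crmweb/access-%{date_str}"
  # distinct scratch directory for this output block (assumed fix)
  temporary_directory => "/tmp/logstash/s3-crmweb"
}
s3 {
  region => "ap-northeast-1"
  bucket => "xxxxxxx"
  prefix => "other-app/access-%{date_str}"
  # a different directory for the second block
  temporary_directory => "/tmp/logstash/s3-other-app"
}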

(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.