S3 Input Plugin randomly fails


(Kraig Paulsen) #1

I'm using Logstash 6.0.1 with the S3 input plugin (logstash-input-s3 3.2.0) to process IIS web server access log files from an S3 bucket. The files are large (about 2.4 GB each). It basically works, but at random Logstash stops being able to process files from the bucket, and in the logs I get:

[2018-01-12T09:10:55,462][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <LogStash::Inputs::S3 access_key_id=>"xxx", bucket=>"egain-cloud-test", region=>"us-east-1", secret_access_key=>"xxx", type=>"testlog", sincedb_path=>"/home/elastic/sincedb/test_sincedb.db", prefix=>"2017/04/24/u_ex", backup_add_prefix=>"archived-", backup_to_bucket=>"egain-cloud-test", interval=>120, delete=>true, add_field=>{"deploymentid"=>"test", "pipe"=>"iislogs"}, id=>"26cb7e9e5a1189724f9ae62ad6a0e63027404a41c41a409125a94cf4f5a8a8d6", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_a405d322-89a5-4fce-9bb2-20d58a3df6e3", enable_metric=>true, charset=>"UTF-8">, temporary_directory=>"/tmp/logstash">
Error: Failed to open TCP connection to egain-cloud-test.s3.amazonaws.com:443 (initialize: name or service not known)
Exception: Seahorse::Client::NetworkingError
Stack: org/jruby/ext/socket/RubyTCPSocket.java:137:in `initialize'
org/jruby/RubyIO.java:1154:in `open'
/usr/share/logstash/vendor/jruby/lib/ruby/stdlib/net/http.rb:885:in `block in connect'
org/jruby/ext/timeout/Timeout.java:149:in `timeout'
/usr/share/logstash/vendor/jruby/lib/ruby/stdlib/net/http.rb:883:in `connect'
/usr/share/logstash/vendor/jruby/lib/ruby/stdlib/net/http.rb:868:in `do_start'
/usr/share/logstash/vendor/jruby/lib/ruby/stdlib/net/http.rb:863:in `start'
/usr/share/logstash/vendor/jruby/lib/ruby/stdlib/delegate.rb:83:in `method_missing'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/seahorse/client/net_http/connection_pool.rb:281:in `start_session'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/seahorse/client/net_http/connection_pool.rb:93:in `session_for'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/seahorse/client/net_http/handler.rb:116:in `session'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/seahorse/client/net_http/handler.rb:68:in `transmit'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/seahorse/client/net_http/handler.rb:42:in `call'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/seahorse/client/plugins/content_length.rb:12:in `call'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/s3_request_signer.rb:88:in `call'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/s3_request_signer.rb:23:in `call'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/xml/error_handler.rb:8:in `call'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/s3_request_signer.rb:65:in `call'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/s3_redirects.rb:15:in `call'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/retry_errors.rb:87:in `call'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/retry_errors.rb:118:in `retry_request'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/retry_errors.rb:101:in `retry_if_possible'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/retry_errors.rb:89:in `call'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/retry_errors.rb:118:in `retry_request'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/retry_errors.rb:101:in `retry_if_possible'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/retry_errors.rb:89:in `call'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/retry_errors.rb:118:in `retry_request'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/retry_errors.rb:101:in `retry_if_possible'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/retry_errors.rb:89:in `call'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/s3_accelerate.rb:42:in `call'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/s3_md5s.rb:31:in `call'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/s3_expect_100_continue.rb:21:in `call'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/s3_bucket_name_restrictions.rb:12:in `call'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/s3_bucket_dns.rb:31:in `call'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/rest/handler.rb:7:in `call'
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/user_agent.rb:12:in `call'
usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/aws-sdk-core-2.3.22/lib/seahorse/client/plugins
/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:574:in `inputworker'
/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:567:in `block in start_input'
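The Error and Exception lines are the useful part here: Seahorse::Client::NetworkingError with "name or service not known" means DNS resolution of the bucket endpoint failed on the Logstash host, before any request reached S3. A minimal sketch to reproduce just that lookup step outside Logstash (the `resolvable?` helper is hypothetical, not part of the plugin):

```ruby
require 'resolv'

# Hypothetical helper: reproduce only the DNS lookup that the stack trace
# shows failing while opening a TCP connection to
# <bucket>.s3.amazonaws.com:443.
def resolvable?(host)
  Resolv.getaddress(host)   # raises Resolv::ResolvError on failure
  true
rescue Resolv::ResolvError
  false
end

# Check the same endpoint the plugin connects to.
puts resolvable?("egain-cloud-test.s3.amazonaws.com")
```

If this intermittently returns false on the Logstash host, the problem is the host's resolver (for example, flaky nameservers in /etc/resolv.conf) rather than the plugin itself.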


(Kraig Paulsen) #2

This is my Logstash config:

input {
  s3 {
    access_key_id => "xxx"
    bucket => "test"
    region => "xxx"
    secret_access_key => "xxx"
    type => "testlog"
    sincedb_path => "/home/elastic/sincedb/test_sincedb.db"
    prefix => "2017/04/24/u_ex"
    backup_add_prefix => "archived-"
    backup_to_bucket => "test"
    interval => 120
    delete => true
    add_field => {
      "deploymentid" => "test"
      "pipe" => "iislogs"
    }
  }
}

filter {
  if [message] =~ "#" {
    drop {}
  }
}
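One caveat with that filter: in a Logstash conditional, [message] =~ "#" is an unanchored regex match, so it drops any line containing a #, not just the IIS header lines that start with one (for example, a request line whose URI contains a literal #). A quick sketch of the matching behaviour, using plain Ruby regexes as a stand-in for the conditional (the sample lines are made up):

```ruby
# Hypothetical sample IIS lines.
header = "#Fields: date time s-ip cs-method cs-uri-stem"
data   = "2017-04-24 00:00:01 10.0.0.1 GET /index.html"
tricky = "2017-04-24 00:00:02 10.0.0.1 GET /page#top"

# Unanchored match, as in `if [message] =~ "#"`:
# drops the header line, but also drops the tricky data line.
[header, data, tricky].each { |m| puts(m =~ /#/ ? "drop" : "keep") }

# Anchored match, as in `if [message] =~ "^#"`:
# only drops lines that actually start with '#'.
[header, data, tricky].each { |m| puts(m =~ /^#/ ? "drop" : "keep") }
```

Anchoring the pattern (if [message] =~ "^#") is the safer form if data lines can ever contain a literal #.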

output {
  elasticsearch {
    hosts => "http://test"
    pipeline => "%{pipe}"
    index => "%{deploymentid}_filebeat-%{+yyyy.MM.dd}"
  }
}

Any help would be appreciated, since this seems like basic functionality that should work rather than randomly stop working. The only way I can recover is to shut Logstash down and restart it.
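For what it's worth, the "Will restart this plugin" loop in the log means Logstash already retries the whole input, but it keeps dying on the same transient lookup. A generic retry-with-backoff sketch (purely illustrative, not an option of logstash-input-s3) shows the kind of handling that rides out a brief resolver outage instead of killing the worker:

```ruby
# Hypothetical helper: retry a block a few times with exponential backoff
# before giving up, so a transient DNS failure does not abort the whole run.
def with_retries(attempts: 5, base_delay: 1)
  tries = 0
  begin
    yield
  rescue StandardError
    tries += 1
    raise if tries >= attempts
    sleep(base_delay * (2**(tries - 1)))  # 1s, 2s, 4s, ...
    retry
  end
end
```

It would wrap a call such as `with_retries { s3.list_objects(bucket: "test") }` (the `s3` client here is assumed, not shown in the thread).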


(system) #3

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.