[2016-11-03T23:22:37,565][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Plugin: <LogStash::Inputs::S3 bucket=>"my-bucket", region=>"us-east-1", prefix=>"prefix/", exclude_pattern=>"_temporary", delete=>true, backup_add_prefix=>"logstashed/", backup_to_bucket=>"my-bucket", interval=>60, type=>"message", id=>"d75fedeb20a07924471e768423feb6c6c8053d54-1", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_43d6abe1-f961-41e3-af17-aae7f6e51581", enable_metric=>true, charset=>"UTF-8">, temporary_directory=>"/tmp/logstash">
Error: uninitialized constant Aws::Client::Errors
Using Logstash from within a Docker container on AWS. I do see activity on the ES instance, but none of the S3 files get deleted or moved to the backup folder.
I just noticed that the README of the logstash-input-s3 plugin is missing some information:
The s3:PutObject action is also required when a backup bucket is used.
So I'd assume that this error generally has to do with insufficient permissions for the particular S3 bucket.
Just in case anyone else finds this post the same way I did, this particular issue can be a permissions problem (https://github.com/logstash-plugins/logstash-input-s3/issues/99). The way I diagnosed this was with awscli -- I did an aws s3 ls bucket/prefix/ and that worked, but an aws s3 cp s3://bucket/prefix/file /tmp/x didn't work. It turns out that your policy needs to look something like this:
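The original post didn't include the policy itself, but based on the actions the plugin performs (list, read, delete, and write backups), a minimal IAM policy sketch might look like the following. The bucket name `my-bucket` is a placeholder -- substitute your own, and trim the actions you don't need (e.g. drop s3:DeleteObject if delete => false):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowBucketListing",
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": ["arn:aws:s3:::my-bucket"]
    },
    {
      "Sid": "AllowObjectAccess",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:DeleteObject", "s3:PutObject"],
      "Resource": ["arn:aws:s3:::my-bucket/*"]
    }
  ]
}
```

Note that s3:ListBucket applies to the bucket ARN itself, while the object-level actions apply to the `/*` resource -- mixing these up is a common reason why aws s3 ls succeeds but aws s3 cp fails.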