Logstash-output-s3

I'm seeing an issue with the above plugin, version 5.2. I'm running it in a Docker container on Kubernetes. It works for about 5 hours before the container crashes with errors like these:

09:51:44.250 [[main]-pipeline-manager] ERROR logstash.outputs.s3 - Error validating bucket write permissions! {:message=>"No space left on device", :class=>"IOError"}

and

 @output_class=LogStash::Outputs::S3>", :error=>"Logstash must have the privileges to write to root bucket `cnc-logs`, check you credentials or your permissions."}

It quite happily writes to the bucket before this, though. I haven't yet checked the disk space usage on the host node. Could that be what's causing this behaviour?

Thanks in advance...

09:51:44.250 [[main]-pipeline-manager] ERROR logstash.outputs.s3 - Error validating bucket write permissions! {:message=>"No space left on device", :class=>"IOError"}

The S3 output uses the local file system to buffer events before asynchronously publishing them to S3, so this error means the output can't write anything more to disk. You can use the `temporary_directory` option to specify which directory to use; by default Logstash uses the OS temporary directory.
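A minimal sketch of an output block using that option (the region and buffer path here are placeholders, not taken from your setup):

```
output {
  s3 {
    bucket => "cnc-logs"
    region => "us-east-1"                               # placeholder region
    temporary_directory => "/data/logstash-s3-buffer"   # point at a path with enough free space
  }
}
```

On Kubernetes you would typically mount an emptyDir or persistent volume at that path, so the buffer isn't constrained by the container's writable layer.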

@output_class=LogStash::Outputs::S3>", :error=>"Logstash must have the privileges to write to root bucket cnc-logs, check you credentials or your permissions."}

Usually Logstash expects to have full permission to write to the bucket (this is the most common configuration). Do you only allow the Logstash user to write to certain directories in the bucket? If you are using finer-grained access control on the bucket, you can disable this check by adding `validate_credentials_on_root_bucket => false` to your configuration.
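If that's your situation, the check can be turned off like this (a sketch; other required settings in the `s3` block are omitted):

```
output {
  s3 {
    bucket => "cnc-logs"
    validate_credentials_on_root_bucket => false   # skip the write test against the bucket root
  }
}
```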
