S3 output plugin doesn't flush the data on shutdown


(Kiril Kirov) #1

Version - Logstash 6.0.0-beta1
logstash-output-s3 version 4.09
OS - Windows Server 2012 R2 Standard
JAVA_VERSION="1.8.0_92"
Logstash is running as a service via NSSM. NSSM calls C:\logstash-6.0.0-beta1\bin\Logstash.bat with the parameters "-f logstash.config". I'm attaching logstash.config for reference.
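One thing worth checking in this setup: how NSSM stops the service. If NSSM's console (Ctrl-C) stop timeout is shorter than the time Logstash needs to shut its pipeline down, NSSM escalates and kills the process before the outputs close. A sketch of the relevant NSSM settings (the service name "logstash" and the 60-second timeouts are assumptions for illustration, not values from this setup):

```
rem Don't skip any stop methods; give the graceful ones time to work
nssm set logstash AppStopMethodSkip 0
rem Allow up to 60 s for the console (Ctrl-C) stop before escalating
nssm set logstash AppStopMethodConsole 60000
```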

We are planning to run Logstash to archive log files from our app servers to S3.
The app servers where Logstash will run are in an Auto Scaling Group, so a shutdown can happen at any time; I need to be sure that Logstash flushes all files from its temporary folders to S3 on shutdown.
The shutdown functionality works on Logstash 5.5.1 with logstash-output-s3 version 4.09, but that combination has a separate "Permission denied" issue when trying to remove the temporary files.
So we are looking into going straight to Logstash 6.0.0-beta1, but on shutdown the temp files are not flushed to S3 (so we are losing logs).
On shutdown we are not getting the line in the Logstash log file that indicates the plugin is being closed (the following line appears only with 5.5.1):
[2017-08-15T21:41:59,044][DEBUG][logstash.outputs.s3 ] closing {:plugin=>"LogStash::Outputs::S3"}

After restart we are getting a line like this:

[2017-08-15T14:24:08,001][DEBUG][logstash.outputs.s3 ] Recovering from crash and uploading {:file=>"C:/Windows/TEMP/logstash/2d66a74e-109b-4428-aeba-29d2c3433f60/Logs/y=2017/m=08/d=15/h=14/s=LSA/l=ServiceOutgoingAudit/ls.s3.0dd8d641-d182-4ef9-b7a0-4dd609a1a9b4.2017-08-15T14.22.part0.txt.gz"}

showing that there was a crash.
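For context, the "Recovering from crash and uploading" message is the plugin's startup-recovery path: anything still sitting in the temporary directory is assumed to be left over from an unclean exit and gets uploaded then. On a clean shutdown, the close path should upload pending temp files first, so that message never appears. A minimal Ruby sketch of the two paths (this is an illustration with a stubbed uploader, not the actual plugin code; all names here are hypothetical):

```ruby
require "tmpdir"

# Sketch only: a writer that appends events to a temp part file and is
# expected to upload pending files when closed.
class SketchS3Writer
  attr_reader :uploaded

  def initialize(tmp_dir)
    @tmp_dir = tmp_dir
    @uploaded = []
  end

  def write(line)
    File.open(File.join(@tmp_dir, "ls.s3.part0.txt"), "a") { |f| f.puts(line) }
  end

  # Graceful shutdown: upload every pending temp file, then delete it.
  # This is the step that never runs when the process is killed.
  def close
    Dir.glob(File.join(@tmp_dir, "*.txt")).each do |file|
      upload(file)
      File.delete(file)
    end
  end

  # Startup recovery ("Recovering from crash and uploading"): anything
  # still on disk was not flushed by the previous run.
  def restore
    Dir.glob(File.join(@tmp_dir, "*.txt")).each { |file| upload(file) }
  end

  private

  def upload(file)
    @uploaded << File.basename(file) # stand-in for the real S3 PUT
  end
end
```

Under this model the symptom reported above is exactly what you'd see if the process is terminated before `close` runs: the temp files survive on disk and are only uploaded by `restore` on the next start.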

The steps to reproduce are simple: run the service, add lines to the log files being monitored, wait 2-3 minutes, and shut down the service.
The newly added lines don't make it to S3. After the service is started again, the new lines are delivered and we see the "Recovering from crash and uploading" message.

Has anyone seen this?
Are there any workarounds with settings etc.?
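While this doesn't fix the missing flush on shutdown, the amount of data at risk can be reduced by rotating and uploading temp files more aggressively, and by relying on the startup recovery. A sketch of the relevant logstash-output-s3 4.x settings (the values are illustrative examples, not recommendations):

```
s3 {
  region => "us-west-2"
  bucket => "my_bucket"
  restore => true              # upload leftover temp files on startup (the default)
  rotation_strategy => "time"  # rotate on time rather than size
  time_file => 1               # rotate/upload every minute instead of the 15-minute default
}
```

With a one-minute `time_file`, an abrupt termination should lose at most the last minute of events until the instance (or a replacement) starts Logstash again and `restore` picks up the leftovers, though in an Auto Scaling Group the terminated instance's temp directory may be gone for good.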

Here is my config:
##########################
input {
  file {
    type => "ApplicationLogs"
    path => "C:/Logs/*/*.log"
    codec => multiline {
      pattern => "^[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}.[0-9]{3}"
      negate => true
      what => "previous"
    }
  }
  file {
    type => "IISLogs"
    path => "C:/Logs/*/IIS/*/*.log"
  }
}
filter {
  if [type] == "ApplicationLogs" {
    grok {
      match => { "path" => "C:/Logs/(?<servicename>[^/]+)/(?<logtype>[^-[0-9]{4}-[0-9]{2}-[0-9]{2}]+)" }
    }
    mutate {
      gsub => ["message", "\r\n", "|"]
    }
  }
  if [type] == "IISLogs" {
    if [message] =~ "^#" {
      drop {}
    }
    grok {
      match => { "path" => "C:/Logs/(?<servicename>[^/]+)/" }
    }
  }
}
output {
  if [type] == "ApplicationLogs" {
    s3 {
      region => "us-west-2"
      bucket => "my_bucket"
      prefix => "Logs/y=%{+yyyy}/m=%{+MM}/d=%{+dd}/h=%{+HH}/s=%{servicename}/l=%{logtype}/"
      codec => line { format => "%{message}" }
      encoding => "gzip"
    }
    stdout { codec => rubydebug }
  }
  if [type] == "IISLogs" {
    s3 {
      region => "us-west-2"
      bucket => "my_bucket"
      prefix => "Logs/y=%{+yyyy}/m=%{+MM}/d=%{+dd}/h=%{+HH}/s=%{servicename}/l=IIS/"
      codec => line { format => "%{message}" }
      encoding => "gzip"
    }
  }
}
###########################


(system) #2

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.