Hello all, I hope some friendly soul can help me debug why my setup is not working.
First off, the goal: Syslogs --> Container running Logstash --> AWS S3 bucket
I have the container up and running without errors, but when I try to have my added logfile "testsyslog.log" shipped to S3, nothing happens. I've been browsing and browsing but can't find any examples of people using this plugin; learning by trial and error is really starting to get to me.
I validated the logstash.conf file with "logstash --config.test_and_exit -f /usr/share/logstash/config/logstash.conf" and it returned OK.
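For reference, this was roughly the invocation (if you run it inside the container, the container name my-logstash below is just a placeholder):

docker exec -it my-logstash logstash --config.test_and_exit -f /usr/share/logstash/config/logstash.conf

It ends with a "Configuration OK" line when the file parses.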
I used the AWS CLI and successfully uploaded files to the bucket, so the credentials used are solid.
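The credential check was a plain CLI upload along these lines (the local filename is just a placeholder; the bucket is the same one as in the config below):

aws s3 cp test-upload.txt s3://logstashfullXXXXXXxx/test-upload.txt --region eu-central-1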
To verify that the Logstash container is reading and outputting my testsyslog.log, I looked for the container's Logstash logfile, but it was nowhere to be found, not even in the declared directory.
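Concretely, I looked roughly like this (placeholder container name again):

docker logs my-logstash
docker exec -it my-logstash ls -la /usr/share/logstash/logs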
So my questions:
- How do I verify that Logstash is even reading my testsyslog.log at the input stage? (There's a sketch of what I mean after the config below.)
- What am I missing for the setup to work?
Thanks in advance!
My Dockerfile
FROM docker.elastic.co/logstash/logstash:7.3.1
COPY --chown=logstash:root logstash.conf ./config/logstash.conf
# Overwrite the sample file with ours.
COPY --chown=logstash:root logstash.conf ./config/logstash-sample.conf
COPY --chown=logstash:root testsyslog.log ./config/testsyslog.log
EXPOSE 9500
EXPOSE 5044
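For completeness, I build and run it roughly like this (image and container names are placeholders):

docker build -t my-logstash .
docker run -d --name my-logstash -p 5044:5044 -p 9500:9500 my-logstash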
My logstash.conf
input {
  file { path => "/usr/share/logstash/config/testsyslog.log" }
  stdin { }
  syslog {
    port => 5044
    codec => cef
  }
}
output {
  stdout { codec => rubydebug }
  s3 {
    access_key_id => "XXXXXXXXXXXXXXXXXXX"
    secret_access_key => "XXXXXXXXXXXXXxxx"
    region => "eu-central-1"
    bucket => "logstashfullXXXXXXxx"
    size_file => 2048
    time_file => 5
    codec => "plain"
    canned_acl => "private"
  }
}
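To make the first question above concrete, this is the kind of check I have in mind, as a minimal sketch (placeholder container name again): append a line to the watched file and watch stdout, since the file input tails for new lines by default and the rubydebug stdout output should then show the event.

docker exec my-logstash sh -c 'echo "appended test line" >> /usr/share/logstash/config/testsyslog.log'
docker logs -f my-logstash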