Logstash container using AWS-S3-output-plugin not shipping logs

Hello all, I hope some friendly soul can help me debug why my setup is not working.

First off, the goal: Syslogs --> Container running Logstash --> AWS S3 bucket

I have the container up and running without errors. However, when I try to have my added logfile "testsyslogs.log" shipped to S3, nothing happens for some reason. I've been browsing and browsing but can't find any examples of people using this plugin, and learning by trial and error is really starting to get to me.
I validated the logstash.conf file with "logstash --config.test_and_exit -f /usr/share/logstash/config/logstash.conf" and it returned OK.
I used the AWS CLI and successfully uploaded files to the bucket, so the credentials used are solid.

To verify that the Logstash container is reading and outputting my testsyslogs.log, I browsed for the container's Logstash logfile, which was nowhere to be found, not even in the declared directory.

So my questions:

  1. How do I verify that Logstash is even reading my testsyslog during input?
  2. What am I missing for the setup to work?

Thanks in advance!

My Dockerfile

FROM docker.elastic.co/logstash/logstash:7.3.1

COPY --chown=logstash:root logstash.conf ./config/logstash.conf
#Overwrite the sample file with ours.
COPY --chown=logstash:root logstash.conf ./config/logstash-sample.conf

COPY --chown=logstash:root testsyslog.log ./config/testsyslog.log


My logstash.conf

input {
  file { path => "/usr/share/logstash/config/testsyslogs.log" }
  stdin { }
  syslog {
    port => 5044
    codec => cef
  }
}

output {
  stdout { codec => rubydebug }
  s3 {
    access_key_id => "XXXXXXXXXXXXXXXXXXX"
    secret_access_key => "XXXXXXXXXXXXXxxx"
    region => "eu-central-1"
    bucket => "logstashfullXXXXXXxx"
    size_file => 2048
    time_file => 5
    codec => "plain"
    canned_acl => "private"
  }
}


It seems to ignore my conf file and tries to ship logs to elasticsearch:9200 for some reason...

It is trying to connect to elasticsearch:9200 to check the X-Pack license.
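
If you are not using monitoring and just want to silence that license check, it can be turned off. A minimal sketch, assuming the stock 7.x Docker image, by disabling X-Pack monitoring in logstash.yml:

```yaml
# logstash.yml
# Disable the X-Pack monitoring feature that triggers the
# connection attempt to elasticsearch:9200 on startup.
# (Assumes you are not shipping monitoring data to Elasticsearch.)
xpack.monitoring.enabled: false
```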

The file input by default only generates events when data is appended to the file it is monitoring. You might want to try adding these to the file input:

start_position => "beginning"
sincedb_path => "/dev/null"
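
In short: `start_position => "beginning"` makes the file input read a newly discovered file from the top instead of tailing it from the end, and `sincedb_path => "/dev/null"` stops Logstash from persisting its read offset, so the file is treated as new on every restart. A sketch of the file input with both applied, using the path from the conf above:

```conf
input {
  file {
    path => "/usr/share/logstash/config/testsyslogs.log"
    # Read the file from the top on first discovery,
    # instead of only emitting lines appended after startup.
    start_position => "beginning"
    # Don't persist the read offset, so the whole file is
    # re-read each time the container restarts.
    sincedb_path => "/dev/null"
  }
}
```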

Ah, so I can ignore that, I guess. I'd like to try these, but what are they supposed to do?

But I still experience a problem where I can only connect input on the default Beats port 5044, and not on the ports that I actually assigned myself, for some reason. Since it shouldn't take input with Beats but with syslog, JSON logs, stdin and files, I'm confused as to why this Filebeat/port 5044 seems to be the only thing showing in the logger?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.