Hi, I'm having trouble figuring out how to set up logstash-forwarder to handle log files that rotate — that is, change their name every day and log everything to a new file. We're kind of new to logstash and elasticsearch here, so we're not sure what's possible, and we're struggling a bit with the docs.
Here's basically what we want to do: there's a postgres server whose logs we want to forward. They're located in /opt/pgdata/pg_log. The current log file for today is called postgresql-2016-02-05_000000.log, and tomorrow it will be postgresql-2016-02-06_000000.log. We want all of those log files to be forwarded. Pretty simple, right? We're just failing really hard at figuring this out. We'd be thankful for any help you can give us.
File path wildcards are supported in logstash-forwarder configurations. Quoting the example from the README file:
# The list of files configurations
# An array of hashes. Each hash tells what paths to watch and
# what fields to annotate on events from those paths.
# single paths are fine
# globs are fine too, they will be periodically evaluated
# to see if any new files match the wildcard.
So... use /opt/pgdata/pg_log/postgresql-*.log as a pattern?
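A minimal configuration using that glob could look like the sketch below. The server address and certificate path are placeholders you'd replace with your own; the "type" field is just an example annotation:

```json
{
  "network": {
    "servers": [ "logstash.example.com:5043" ],
    "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt",
    "timeout": 15
  },
  "files": [
    {
      "paths": [ "/opt/pgdata/pg_log/postgresql-*.log" ],
      "fields": { "type": "postgresql" }
    }
  ]
}
```

Since globs are re-evaluated periodically, tomorrow's postgresql-2016-02-06_000000.log will be picked up automatically once it appears.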
If the problem is that you want to ingest all the old logs but LSF won't touch them, note that LSF by default ignores files older than 24 hours. You can adjust that with the "dead time" configuration option (see the documentation of Go's time package for syntax details):
"dead time": "8760h"
(Keep in mind that logstash-forwarder is deprecated and has been replaced by Filebeat.)
Thanks, that seems to have worked. Haha, I feel kinda dumb for missing that option — I think we've even used the wildcard for apache logs in the past. And yeah, we're aware of Filebeat, but we're sticking with logstash-forwarder for now.