Configure Logstash-Forwarder to process old files

Hi all,

I'm trying to test a Logstash-Forwarder -> Logstash feed and it seems that Logstash-Forwarder by default skips files where the entries are older than 24 hours.

I've hit this issue because the test files I happen to be using are a couple of months old. However, it's possible that in our 'real' solution some log files would not get written to for 24 hours or more, and I would still want Logstash-Forwarder to continue monitoring these files and forward events to Logstash when they were written to.

Is it possible to configure Logstash-Forwarder to monitor all the files on my path, no matter the age of the last event written to them?

Note that I reset the timestamp of my test files to the current date/time to see if that resolved the issue (although this wouldn't be a proper fix), but it didn't, which suggests that it's the date/time of the events within the file rather than the file itself that matters?

Unless I'm doing something dumb (more than possible ... :smile: ), this seems like a fairly significant issue for us in choosing Logstash-Forwarder as a solution.

Cheers,
Steve

Each JSON object in the files array can contain a dead time property that does indeed default to 24 hours. This should set the dead time to 100 days (2400 hours) for a set of files:

{
  "files": [
    {
      "paths": [
        ...
      ],
      "dead time": "2400h"
    }
  ],
  ...
}
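For completeness, a full files entry might look like the following (the path /var/log/myapp/*.log is a made-up example; your network section and other settings would sit alongside it):

{
  "files": [
    {
      "paths": [
        "/var/log/myapp/*.log"
      ],
      "dead time": "2400h"
    }
  ]
}

The dead time value uses Go duration syntax, so "2400h" is 100 days.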

Hi Magnus,

Thanks for the quick response. I'll retry with this parameter; it looks like it should help.

For my information, do you know what LSF uses to determine when a file is too old? Is it the last-modified time of the file itself (I reset this to the current time and it seemed to make no difference) or some other value?

Regards,
Steve

AFAICT from prospector.go:120 it's the file's modification time.