Logstash and ~8000 files in ~40 file inputs?

Hi guys,

I have a specific situation where I want to index ~8000 files from a network share. Those are IIS logs that are rotated regularly.

Is this something that a single Logstash 2.0.0 agent on Linux can handle, or would it be smarter to split the workload across multiple separate agents?

Some of these logs will be indexed once and that's it (because they are rotated files), while others will have to be indexed in real time (logs will keep flowing into them).

I've tried using a single file {} input, but after some time processing would simply halt.

Do you guys have experience with such a high number of files?

That's a lot of files!

You will want to split things out into different input sections so that you get as much threading as possible. A single file input with a bunch of wildcards will appear to stall because it is single-threaded: it sits and blocks while it reads each file in turn.
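A minimal sketch of that split, assuming the share is mounted under /mnt/iislogs (the paths and globs here are illustrative, not from the thread):

input {
  # Each file {} block runs in its own reader thread.
  file {
    path => "/mnt/iislogs/server1/*.log"
  }
}
input {
  file {
    path => "/mnt/iislogs/server2/*.log"
  }
}

Carving the ~8000 files into several smaller globs like this means no single thread has to walk the entire share by itself.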

Do I need to have multiple input sections, like this:

input { file {} }
input { file {} }

or just multiple file {} blocks within a single input section:

input {
  file {}
  file {}
}

That one :slight_smile: Multiple file {} blocks inside a single input section work fine. Logstash merges all input sections at load time anyway, and each file {} block still gets its own thread either way.
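Putting it together, a hedged sketch of how the one-shot rotated files and the live logs from the original question might be separated (the paths are illustrative; start_position is a standard file input option):

input {
  file {
    # Rotated archives: read each file once from the top.
    # start_position only applies to files without an existing sincedb entry.
    path => "/mnt/iislogs/archive/*.log"
    start_position => "beginning"
  }
  file {
    # Live logs: tail new lines as they arrive (the default behaviour).
    path => "/mnt/iislogs/current/*.log"
  }
}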