Limit the number of config files logstash reads from a directory

I'll try to outline the problem we're trying to solve. We have a process that automatically writes multiple files (the contents, layouts, and filenames are all unique) into a specific directory. The files can also vary in size (from small to a few gigabytes). Assume there are 100 files in this directory. We would like Logstash to process X files at a time. We can dynamically generate a config file for each input file in a separate directory and pass that directory to Logstash; however, it then attempts to load them all at once. Is this controllable?
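For reference, each generated per-file config looks roughly like the sketch below (the path, options, and output are placeholders, not our exact setup). Starting Logstash with `-f` pointed at the generated directory loads every such file and merges them into a single pipeline, which is why they all run at once.

```
# Hypothetical auto-generated config for one incoming file.
# Logstash started with "-f /etc/logstash/generated.d/" (placeholder path)
# loads ALL config files in that directory into one pipeline.
input {
  file {
    path => "/data/incoming/report-0001.log"   # placeholder filename
    start_position => "beginning"              # read the file from the start
    sincedb_path => "/dev/null"                # common idiom: don't remember read position
  }
}

output {
  stdout {}   # placeholder; the real configs have their own filters/outputs
}
```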

Is there a way to load-balance multiple instances of Logstash that monitor this one directory?

Thanks in advance!!

There's no way to do this based on numbers, but you could do it based on filename matching, e.g. A*, B*, 0*, 1*, etc.
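A minimal sketch of what that split could look like, assuming the files land in /data/incoming (the directory name and the two-way partition are made up): give each Logstash instance, or each pipeline, its own file input whose path glob matches only its slice of the filenames.

```
# pipeline-a.conf -- run by Logstash instance A (or pipeline "a" in pipelines.yml)
input {
  file {
    path => "/data/incoming/[a-m]*"                 # only filenames starting with a-m
    start_position => "beginning"
    sincedb_path => "/var/lib/logstash/sincedb-a"   # placeholder sincedb location
  }
}
output {
  stdout {}   # placeholder; keep your real filters/outputs here
}

# pipeline-b.conf -- a second instance/pipeline, identical except for:
#   path => "/data/incoming/[n-z0-9]*"
#   sincedb_path => "/var/lib/logstash/sincedb-b"
```

The file input's path option accepts glob patterns, so character classes like [a-m]* should work as a way to shard the directory; how many shards to create, and how even the split is, depends entirely on your filenames.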

Is there a way to queue and load-balance Logstash in any form when processing multiple files?

I was reading up on persistent queues. Can this feature be leveraged in any way?

Persistent queues won't solve this; they buffer events on disk between the inputs and the rest of the pipeline, but they don't limit how many files the file input reads at once.
