I'll try to outline the problem we're trying to solve. We have a process that automatically writes multiple files (each with unique content, layout, and filename) into a specific directory. The files also vary in size, from small up to a few gigabytes. Assume there are 100 files in this directory. We would like Logstash to process X files at a time. We can dynamically generate a config file for each input file into a separate directory and pass that directory to Logstash, but then it attempts to load them all at once. Is this controllable?
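For context, here's a minimal sketch of the kind of per-file config the generator might write out (the paths and filenames are assumptions, and the stdout output is just a placeholder):

```
# Hypothetical per-file config produced by the generator,
# e.g. dropped into /etc/logstash/conf.d/ (paths are illustrative).
input {
  file {
    path => "/data/incoming/file-042.log"        # one unique file per generated config
    start_position => "beginning"                # read existing content, not just new lines
    sincedb_path => "/var/lib/logstash/sincedb-file-042"  # track read progress per file
  }
}
output {
  # Placeholder destination for illustration only.
  stdout { codec => rubydebug }
}
```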
Alternatively, is there a way to load-balance multiple instances of Logstash monitoring this one directory?
Thanks in advance!!