How to deal with a large number of files in a single directory

Filebeat scans the whole directory to find the files that need to be collected. But when the directory contains a large number of files, Filebeat creates a goroutine for each one, causing high I/O wait. Can anyone help me?

@Lihang_Gong Yes, by default Filebeat creates one goroutine (harvester) per discovered file, which in your case can result in a lot of goroutines. But it's possible to set a hard limit on the number of files being read concurrently; see the harvester_limit option in our documentation.
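
A minimal sketch of what that could look like in filebeat.yml, assuming the log input type; the path and the limit value here are placeholders you'd tune for your setup:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log   # placeholder path; point this at your directory
    # Cap the number of harvesters (one goroutine per open file) started
    # in parallel for this input. The default of 0 means no limit.
    harvester_limit: 500
    # Close files that haven't been updated recently so harvester slots
    # free up for other files still waiting to be collected.
    close_inactive: 5m
```

With a limit in place, files beyond the cap wait until a running harvester closes, so pairing harvester_limit with a sensible close_inactive keeps slots cycling through the directory.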

Thanks. I'm gonna try it. :grin:
