How to deal with large number of files in a single directory

(Lihang Gong) #1

Filebeat scans the whole directory to find files that need to be collected. But when there is a large number of files in the directory, Filebeat creates many goroutines to collect them, causing high I/O wait. Can anyone help me?

(Pier-Hugues Pellerin) #2

@Lihang_Gong Yes, by default Filebeat creates one goroutine (harvester) for each discovered file, so in your case that could mean a lot of goroutines. But it's possible to set a hard limit on the number of files being read concurrently; see the harvester_limit option in our documentation.
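A minimal sketch of what that might look like in filebeat.yml (the paths and the values of 100 and 1m are just placeholders, not recommendations; tune them to what your disk can sustain):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/*.log   # hypothetical path, adjust to your setup
    # Hard cap on the number of harvesters (goroutines reading files)
    # running concurrently for this input. Default is 0, i.e. unlimited.
    harvester_limit: 100
    # When the limit is reached, new files are only picked up once a
    # running harvester finishes, so closing idle files sooner helps
    # free up slots.
    close_inactive: 1m
```

Note that harvester_limit applies per input, so if you define multiple inputs the effective total is the sum of their limits.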

(Lihang Gong) #3

Thanks. I'm gonna try it. :grin:

(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.