Hello,
I am working in forensics and have a lot of log files, each 100 MB in size.
I feel very uncomfortable using Filebeat because it throws a lot of "too many open files" errors, and I am not sure whether every file is actually being processed.
I am aware of this article: Too many open file handlers | Filebeat Reference [8.11] | Elastic
Reading this article makes me wonder whether Filebeat is the right tool for importing archived files in bulk into Elasticsearch:
"Filebeat keeps the file handler open in case it reaches the end of a file so that it can read new log lines in near real time."
I have no active files; every file is already closed, and I don't want Filebeat to hold the file handles open.
So is there a beat that processes one file after another and does not watch for active files?
I know Filebeat has many options, but it is built as an active log file reader and file watcher, and I don't need those functions for my archived log files. The closest I have come with the existing options is the sketch below.
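For context, this is roughly the configuration I have been experimenting with to keep the handle count down (a minimal sketch assuming the 8.x filestream input; the id, paths, and output host are placeholders for my actual setup):

```yaml
filebeat.inputs:
  - type: filestream
    id: archived-forensic-logs    # placeholder id (required for filestream inputs)
    paths:
      - /data/archive/*.log       # placeholder path to the archived log files
    # Close the file handle as soon as the reader hits EOF,
    # instead of keeping it open waiting for new lines.
    close.reader.on_eof: true
    # Cap the number of files harvested in parallel so the
    # process never exhausts its open-file limit.
    harvester_limit: 8

output.elasticsearch:
  hosts: ["https://localhost:9200"]  # placeholder Elasticsearch endpoint
```

Even with these settings, Filebeat keeps rescanning the paths and tracking every file in its registry, which is exactly the watcher behaviour I don't need for a one-time import.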
Thank you
Chr1s