I'm concerned about using Filebeat to gather logs from multiple laptops running macOS: some of them may be connected over cellular networks, so there is a risk that large logs mean extra costs for the users.
Is there a clever way to limit transfers to, say, 100k per hour, or something like that? I thought about lowering bulk_max_size, but I don't think that would help: the data gets transferred regardless, as long as the harvester is reading files.
There is no configuration option to impose a limit on the bandwidth used or the number of events that filebeat reports.
You can try using system tools (or a third-party app) to limit the bandwidth used by the filebeat process.
On macOS this might be possible using built-in tools.
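As a rough sketch, macOS ships `dnctl` and `pfctl` (the dummynet traffic shaper and the pf firewall), which can throttle traffic toward a given port. Note that dummynet is deprecated on recent macOS versions and may not work there; the host/port below are placeholders for your Elasticsearch endpoint:

```shell
# Create a dummynet pipe capped at 100 Kbit/s (placeholder rate).
sudo dnctl pipe 1 config bw 100Kbit/s

# Send outbound traffic to port 9200 (placeholder) through that pipe,
# loaded into a pf anchor named "throttle".
echo "dummynet out proto tcp from any to any port 9200 pipe 1" | sudo pfctl -a throttle -f -

# Enable pf if it is not already running.
sudo pfctl -E
```

This shapes traffic by destination, not by process, so anything else talking to that port is throttled too.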
You also have a few options to reduce the bandwidth used:
- Reduce the number of log sources monitored.
- Use the max_bytes setting to clip large log messages.
- Set a high compression level in the output.
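As a sketch, the last two suggestions could look like this in filebeat.yml (the path and host below are placeholders):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log    # placeholder path
    max_bytes: 10240            # truncate any single message to 10 KB

output.elasticsearch:
  hosts: ["https://elastic.example.com:9200"]  # placeholder host
  compression_level: 9   # 0-9; highest trades CPU for less bandwidth
```

This bounds the size of individual events and shrinks each bulk request, though it does not cap total transfer per hour.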
Thank you, Adrian. Unfortunately, all the options you mention (except using the OS to reduce bandwidth) are unpredictable, in that a high volume of data will still eventually mean a high data transfer.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.