We are running Filebeat on a production server to ship logs to Logstash, but it cannot keep up with the rate at which logs are being written. Filebeat's registry shows that over the past 3 hours it has processed only 26 GB of an 81 GB log file. Our expected volume from this one log file is 400-800 GB of logs per day. Filebeat is also consuming 60-70% CPU on this server, which is quite powerful hardware. This makes us wonder whether Filebeat can handle the task at hand.
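To make the gap concrete, here is a quick back-of-the-envelope calculation using the numbers above (the GB figures are taken from the post; the extrapolation assumes the observed 3-hour rate is sustained):

```python
# Back-of-the-envelope throughput check using the figures from the post.
processed_gb = 26                  # shipped during the observation window
window_hours = 3
required_gb_per_day = (400, 800)   # expected daily volume for this one file

rate_gb_per_hour = processed_gb / window_hours   # ~8.7 GB/h
rate_gb_per_day = rate_gb_per_hour * 24          # ~208 GB/day if sustained

# How far short of the requirement the current rate falls.
shortfall_low = required_gb_per_day[0] / rate_gb_per_day    # ~1.9x
shortfall_high = required_gb_per_day[1] / rate_gb_per_day   # ~3.8x
print(f"{rate_gb_per_day:.0f} GB/day sustained; "
      f"need {shortfall_low:.1f}x-{shortfall_high:.1f}x more throughput")
```

So at the observed rate Filebeat would ship roughly 208 GB/day, about 2-4x below what we need.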
We have checked our Logstash ingester (which sits in front of RabbitMQ) and it does not appear to be the bottleneck. The rest of our pipeline seems healthy.
Thanks in advance for any suggestions.
Here is our Filebeat configuration:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /mypath/log.csv
    processors:
      - add_fields:
          fields:
            myfield1: "1"
            myfield2: "2"
            myfield3: "3"

logging.level: info
logging.to_files: true
logging.files:
  path: /var/log/filebeat
  name: filebeat
  keepfiles: 7
  permissions: 0644

output.logstash:
  hosts: ["my-load-balancer.domain.com:5045"]
  ssl.certificate_authorities: ["/etc/pki/tls/certs/ca-bundle.crt"]
  worker: 2
  bulk_max_size: 3200
```