Filebeat has to be restarted to send logs to Elasticsearch


(Sasanka Mourya) #1

I can see that logs are added only after I restart Filebeat on the client machine. Every time, I have to restart it to get the logs updated. I need help figuring out where the fault is.
Thanks in advance.


(Sasanka Mourya) #2

UPDATE: With scan_frequency set, the logs now update dynamically, but the process is very slow. Is there a way to find out how fast Logstash is sending the logs to Elasticsearch?
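(One way to check this, assuming the Logstash monitoring API is enabled and reachable on its default port 9600 on the Logstash host, is to poll the node stats endpoint and watch the event counters; the localhost address below is just an assumption, not something from this thread:)

curl -s 'http://localhost:9600/_node/stats/events?pretty'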


(Thiago Souza) #3

Can you please share your filebeat configuration and log file (if possible)?


(Sasanka Mourya) #4

filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/auth.log
    - /var/log/syslog
    - /var/www/magento/var/log/system.log
    - /var/www/magento/var/log/exception.log
  scan_frequency: 1s

output.logstash:
  hosts: ["logstash.computenext.com:5443"]
  bulk_max_size: 2048
  ssl.certificate_authorities: ["/etc/filebeat/logstash.crt"]
  template.name: "filebeat"
  template.path: "filebeat.template.json"
  template.overwrite: false

This is my Filebeat config file.
Is there anything I can do in my filebeat.yml to speed up the process?
Sorry for the format. I am new here.


(Thiago Souza) #5

The scan_frequency setting configures how fast Filebeat picks up new log files. Your configuration reads a fixed set of log files only, so scan_frequency should have no effect here.

Also, those template.* settings are not valid Logstash output settings. They should not be causing any problems, but I recommend removing them.
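For reference, a sketch of the output section with the template.* lines removed (values taken from the configuration posted above):

output.logstash:
  hosts: ["logstash.computenext.com:5443"]
  bulk_max_size: 2048
  ssl.certificate_authorities: ["/etc/filebeat/logstash.crt"]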

Lastly, besides the points above, I don't see anything else in your configuration that could be causing what you are reporting. If you share the Filebeat logs, maybe we can find other issues.


(Sasanka Mourya) #6

Well, that had some effect. I reduced the scan_frequency and it doubled the throughput. Thanks for the help @thiago.

If possible, have a look at another issue that I've been facing. Here is the link:
Filebeat in windows is unable to send logs to logstash in Ubuntu server

Thanks in advance.


(Thiago Souza) #7

I am glad that your problem is solved, but keep in mind that the scan_frequency setting does not configure how fast a log file is read. Also, setting scan_frequency to less than 1s is not recommended.

I strongly recommend that you read the documentation on the scan_frequency setting.
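As an illustration only (the 10s value is an assumed example, not a recommendation from this thread), here is a prospector sketch with scan_frequency left at a conservative value:

filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/syslog
  # scan_frequency controls how often Filebeat scans the configured paths
  # for new files; it does not change how fast an already-open file is read,
  # and values below 1s are not recommended.
  scan_frequency: 10s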


(Sasanka Mourya) #8

Getting double the throughput must have been because of restarting Filebeat. I've just read what is in your link and found that I can use close_inactive to get the logs close to real time. Thanks a lot @thiago.
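A minimal sketch of where close_inactive would go, assuming the same prospector layout as above (the 5m value is just an illustrative assumption):

filebeat.prospectors:
- input_type: log
  paths:
    - /var/www/magento/var/log/system.log
  # close_inactive closes the harvester once the file has not been updated
  # for this duration; if the file changes again, it is picked up on a later scan.
  close_inactive: 5m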


(system) #9

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.