Filebeat in endless loop

I have Filebeat reading S3 logs, but it sometimes gets stuck in an endless loop, sometimes it just hangs, and (rarely) it works as expected.

filebeat.prospectors:
- type: log
  enabled: true
  close_timeout: 5m
  paths:
    - /home/linux/logs/s3-dev-logs/*
    - /home/linux/logs/cloudfront-dev-logs/*.gz

output.logstash:
  hosts: ["localhost:5044"]

By "endless loop" I mean that Filebeat just keeps re-reading the same log files.

All components (Filebeat, Logstash, and Elasticsearch) are version 6.3.2.

The command I run Filebeat with is:
./filebeat -e -once -c filebeat.yml -d "publish"

and the OS is RHEL 7.4.

Are you trying to read gzipped logs? AFAIK Filebeat does not handle these; you would need to extract them first. You should probably set exclude_files so the .gz files are skipped.
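Something like this, as a rough sketch based on the paths in your config (exclude_files takes a list of regular expressions matched against the file name):

filebeat.prospectors:
- type: log
  paths:
    - /home/linux/logs/cloudfront-dev-logs/*
  exclude_files: ['\.gz$']   # skip gzipped files, which Filebeat cannot decompress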

Even when I comment the .gz part out, it gets stuck in an endless loop most of the time. As an aside, the s3-dev-logs are ASCII text, if that makes any difference.
Also, how can I kill Filebeat? Ctrl+C doesn't work; it just stays alive, and I have to close the terminal session to kill it.

If you use -once then you should also set close_eof: true so that each harvester exits once it reaches the end of its file. Once all harvesters exit, the Beat should stop. See the description of the -once flag in the docs.
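For example, a minimal sketch of the prospector section from the config above with close_eof added:

filebeat.prospectors:
- type: log
  paths:
    - /home/linux/logs/s3-dev-logs/*
  close_eof: true     # close each harvester at EOF so -once can finish
  close_timeout: 5m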

In order for the harvester to stop, the data also needs to be successfully delivered to the output.
