Filebeat stops automatically after some time

Hi,

I am using Filebeat version 7.4.2. The issue is that my Filebeat service stops automatically after running for some time. Whenever I start it, it crawls properly and ships data into Elasticsearch for a while, but then, unexpectedly (maybe after 1 or 2 hours), the Filebeat service stops. There are no evident errors in the logs.

Please find below the logs from the Filebeat runtime:

2020-07-22T09:55:36.814+0100    INFO    beater/filebeat.go:443  Stopping filebeat
2020-07-22T09:55:36.814+0100    INFO    crawler/crawler.go:139  Stopping Crawler
2020-07-22T09:55:36.814+0100    INFO    crawler/crawler.go:149  Stopping 4 inputs
2020-07-22T09:55:36.814+0100    INFO    cfgfile/reload.go:229   Dynamic config reloader stopped
2020-07-22T09:55:36.814+0100    INFO    input/input.go:149      input ticker stopped
2020-07-22T09:55:36.814+0100    INFO    input/input.go:167      Stopping Input: 11908303673130337906
2020-07-22T09:55:36.814+0100    INFO    input/input.go:149      input ticker stopped
2020-07-22T09:55:36.814+0100    INFO    input/input.go:149      input ticker stopped
2020-07-22T09:55:36.814+0100    INFO    input/input.go:167      Stopping Input: 4203590883583984468
2020-07-22T09:55:36.814+0100    INFO    input/input.go:149      input ticker stopped
2020-07-22T09:55:36.814+0100    INFO    log/harvester.go:272    Reader was closed: /path/filename.txt. Closing.
2020-07-22T09:55:36.814+0100    INFO    log/harvester.go:272    Reader was closed: /path/filename.txt. Closing.
2020-07-22T09:55:36.814+0100    INFO    input/input.go:167      Stopping Input: 6728572276079537273
2020-07-22T09:55:36.814+0100    INFO    log/harvester.go:272    Reader was closed: /path/filename.txt. Closing.
2020-07-22T09:55:36.814+0100    INFO    input/input.go:167      Stopping Input: 13712394945884954598
2020-07-22T09:55:36.814+0100    INFO    log/harvester.go:272    Reader was closed: /path/filename.txt. Closing.
2020-07-22T09:55:36.814+0100    INFO    log/harvester.go:272    Reader was closed: /path/filename.txt. Closing.
2020-07-22T09:55:36.814+0100    INFO    log/harvester.go:272    Reader was closed: /path/filename.txt. Closing.
2020-07-22T09:55:36.814+0100    INFO    log/harvester.go:272    Reader was closed: /path/filename.txt. Closing.
2020-07-22T09:55:36.814+0100    INFO    log/harvester.go:272    Reader was closed: /path/filename.txt. Closing.
2020-07-22T09:55:36.814+0100    INFO    log/harvester.go:272    Reader was closed: /path/filename.txt. Closing.
2020-07-22T09:55:36.814+0100    INFO    log/harvester.go:272    Reader was closed: /path/filename.txt. Closing.
2020-07-22T09:55:36.814+0100    INFO    log/harvester.go:272    Reader was closed: /path/filename.txt. Closing.
2020-07-22T09:55:36.814+0100    INFO    log/harvester.go:272    Reader was closed: /path/filename.txt. Closing.
2020-07-22T09:55:36.814+0100    INFO    log/harvester.go:272    Reader was closed: /path/filename.txt. Closing.
2020-07-22T09:55:36.814+0100    INFO    log/harvester.go:272    Reader was closed: /path/filename.txt. Closing.
2020-07-22T09:55:36.814+0100    INFO    harvester/forwarder.go:52       Input outlet closed
2020-07-22T09:55:36.814+0100    INFO    log/harvester.go:272    Reader was closed: /path/filename.txt. Closing.
2020-07-22T09:55:36.814+0100    INFO    log/harvester.go:272    Reader was closed: /path/filename.txt. Closing.
2020-07-22T09:55:36.814+0100    INFO    harvester/forwarder.go:52       Input outlet closed
2020-07-22T09:55:36.814+0100    INFO    log/harvester.go:272    Reader was closed: /path/filename.txt. Closing.
2020-07-22T09:55:36.814+0100    INFO    log/harvester.go:272    Reader was closed: /path/filename.txt. Closing.
2020-07-22T09:55:36.814+0100    INFO    crawler/crawler.go:165  Crawler stopped
2020-07-22T09:55:36.815+0100    INFO    registrar/registrar.go:367      Stopping Registrar
2020-07-22T09:55:36.815+0100    INFO    registrar/registrar.go:293      Ending Registrar
2020-07-22T09:55:36.820+0100    INFO    [monitoring]    log/log.go:153  Total non-zero metrics  {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":10540,"time":{"ms":10547}},"total":{"ticks":68190,"time":{"ms":68203},"value":68190},"user":{"ticks":57650,"time":{"ms":57656}}},"handles":{"limit":{"hard":16000,"soft":16000},"open":10},"info":{"ephemeral_id":"b57f1c4d-7a80-4f1f-aaba-5ab9ee057757","uptime":{"ms":7119571}},"memstats":{"gc_next":22377264,"memory_alloc":11462592,"memory_total":18240359416,"rss":50831360},"runtime":{"goroutines":21}},"filebeat":{"events":{"added":528063,"done":528063},"harvester":{"closed":77,"open_files":0,"running":0,"started":77},"input":{"log":{"files":{"truncated":38}}}},"libbeat":{"config":{"module":{"running":0},"reloads":1},"output":{"events":{"acked":527884,"batches":4732,"failed":51426,"total":579310},"read":{"bytes":32364,"errors":4},"type":"logstash","write":{"bytes":180629879,"errors":19}},"pipeline":{"clients":0,"events":{"active":0,"filtered":179,"published":527884,"retry":99719,"total":528063},"queue":{"acked":527884}}},"registrar":{"states":{"cleanup":8,"current":38,"update":528063},"writes":{"success":4356,"total":4356}},"system":{"cpu":{"cores":8},"load":{"1":0.66,"15":0.52,"5":0.56,"norm":{"1":0.0825,"15":0.065,"5":0.07}}}}}}
2020-07-22T09:55:36.820+0100    INFO    [monitoring]    log/log.go:154  Uptime: 1h58m39.572210325s
2020-07-22T09:55:36.820+0100    INFO    [monitoring]    log/log.go:131  Stopping metrics logging.
2020-07-22T09:55:36.820+0100    INFO    instance/beat.go:432    filebeat stopped.

Welcome to our community! :smiley:

It looks like something external is asking Filebeat to stop — the log shows a clean, orderly shutdown (`Stopping filebeat`, inputs and harvesters closing, `filebeat stopped.`) rather than a crash. Is there anything in your OS logs that might show what sent the stop signal?
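If you're on a systemd-based Linux host, something like the following might help narrow it down. This is only a sketch, and it assumes the service unit is named `filebeat` and that you have permission to read the journal and kernel log — adjust as needed for your setup:

```shell
# Check the journal for who/what stopped the filebeat unit around the time it died
# (the --since window is just an example; widen it to cover the stop timestamp).
journalctl -u filebeat --since "2 hours ago" | grep -iE "stop|signal|kill" || true

# Check whether the kernel OOM killer terminated the process.
dmesg 2>/dev/null | grep -iE "killed process|out of memory" || true

# Check the unit's last exit status and whether it exited due to a signal.
systemctl status filebeat 2>/dev/null | head -n 20 || true
```

A normal `systemctl stop` (or a SIGTERM from another process, a config-management tool, or a monitoring agent) would produce exactly the graceful shutdown sequence in your log, so the journal entries around 09:55:36 are the most interesting place to look.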