Harvester started for file but not reading logs

--------------------filebeat.log-----------------------

2019-06-24T12:26:31.687+0530 INFO [monitoring] log/log.go:144 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":375},"total":{"ticks":890,"time":{"ms":15},"value":890},"user":{"ticks":515,"time":{"ms":15}}},"handles":{"open":218},"info":{"ephemeral_id":"82de4b08-58f5-44c6-ba79-63aaa0947d2a","uptime":{"ms":450131}},"memstats":{"gc_next":4362160,"memory_alloc":2375936,"memory_total":10982776,"rss":-69632}},"filebeat":{"harvester":{"open_files":2,"running":2}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":4,"events":{"active":0}}},"registrar":{"states":{"current":7}}}}}
2019-06-24T12:26:36.953+0530 INFO log/harvester.go:279 File is inactive: C:\inetpub\logs\LogFiles\W3SVC1\u_ex190620.log. Closing because close_inactive of 5m0s reached.
2019-06-24T12:26:41.947+0530 INFO log/harvester.go:254 Harvester started for file: C:\inetpub\logs\LogFiles\W3SVC1\u_ex190620.log
2019-06-24T12:27:01.701+0530 INFO [monitoring] log/log.go:144 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":375,"time":{"ms":15}},"total":{"ticks":906,"time":{"ms":31},"value":906},"user":{"ticks":531,"time":{"ms":16}}},"handles":{"open":220},"info":{"ephemeral_id":"82de4b08-58f5-44c6-ba79-63aaa0947d2a","uptime":{"ms":480131}},"memstats":{"gc_next":4362160,"memory_alloc":2670896,"memory_total":11277736,"rss":110592}},"filebeat":{"events":{"added":4,"done":4},"harvester":{"closed":2,"open_files":2,"running":2,"started":2}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":4,"events":{"active":0,"filtered":4,"total":4}}},"registrar":{"states":{"current":7,"update":4},"writes":{"success":4,"total":4}}}}}
2019-06-24T12:27:31.695+0530 INFO [monitoring] log/log.go:144 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":406,"time":{"ms":16}},"total":{"ticks":952,"time":{"ms":31},"value":952},"user":{"ticks":546,"time":{"ms":15}}},"handles":{"open":220},"info":{"ephemeral_id":"82de4b08-58f5-44c6-ba79-63aaa0947d2a","uptime":{"ms":510280}},"memstats":{"gc_next":4362160,"memory_alloc":2825392,"memory_total":11432232,"rss":-16384}},"filebeat":{"harvester":{"open_files":2,"running":2}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":4,"events":{"active":0}}},"registrar":{"states":{"current":7}}}}}

-------- (the three messages above repeat in an infinite loop)

----------------filebeat.yml---------------------
filebeat.config.modules:
  path: "${path.config}/modules.d/*.yml"
  reload.enabled: false

output.elasticsearch:
  hosts:
    - "192.168.75.16:9200"

setup.kibana:
  host: "192.168.75.16:5601"

setup.template.settings:
  index.number_of_shards: 1

------------------iis.yml------------------------
- module: iis
  access:
    enabled: true
    var.paths: ["C:\inetpub\logs\LogFiles\W3SVC1\*.log"]
  error:
    enabled: true
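One thing worth checking (an assumption on my part, not something the logs confirm): in double-quoted YAML strings, a backslash begins an escape sequence, so Windows paths written with backslashes can be misread by the parser. Forward slashes avoid the ambiguity. A minimal sketch of the same fileset with that change:

```yaml
- module: iis
  access:
    enabled: true
    # Forward slashes sidestep YAML escape handling in double-quoted strings
    var.paths: ["C:/inetpub/logs/LogFiles/W3SVC1/*.log"]
  error:
    enabled: true
```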

Hello, thanks for reaching out. What are the timestamps and sizes of the files that match this glob? I've seen cases where there are no current logs and Filebeat does not ship entries until new files have been generated.

C:\inetpub\logs\LogFiles\W3SVC1\*.log

I am appending log entries manually to the log file for which the harvester has started, but the Filebeat logs still only show "Non-zero metrics in the last 30s". After 5 minutes it closes the file and then opens the same file again.
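The close-after-5-minutes behaviour described above matches Filebeat's `close_inactive` setting, which defaults to 5m: a harvester that sees no new data for that long closes the file, and the scanner later reopens it. If manually appended lines are still not shipped, one thing to try (a hedged sketch only; the `input:` override and the values shown are suggestions, not confirmed fixes) is loosening those timings in the module config:

```yaml
- module: iis
  access:
    enabled: true
    var.paths: ["C:/inetpub/logs/LogFiles/W3SVC1/*.log"]
    input:
      # Keep the harvester open longer between writes (default is 5m)
      close_inactive: 10m
      # Check the files for new content more often (default is 10s)
      scan_frequency: 5s
  error:
    enabled: true
```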

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.