Filebeat on Windows not reading log files

I am attempting to have Filebeat read JSON packets from log files in a specific directory and send them directly to Elasticsearch, without using Logstash. I got the Filebeat service to start, but it keeps printing only the following message in the PowerShell console:

INFO    [monitoring]    log/log.go:145  Non-zero metrics in the last 30s        
 {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":359},"total": 

Can anyone help with this? The ELK stack is running in Docker, and it does seem to be working when I check it in the browser.

The filebeat.yml config seems to be correct too:

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.
- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - C:\var\log\*
    #- c:\programdata\elasticsearch\logs\*


and the output config:

#============================= Filebeat modules =============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

#==================== Elasticsearch template setting ====================

setup.template.settings:
  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false

#-------------------------- Elasticsearch output ------------------------------

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: [""]
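One quick sanity check before digging into harvesting: Filebeat ships `test` subcommands that validate the config file and the output connection. A sketch, assuming Filebeat was unpacked to C:\Program Files\Filebeat (adjust the path to your install) — note that with `hosts: [""]` left empty, the output test will fail until a real Elasticsearch address (e.g. your Docker host and port) is filled in:

```shell
# From the Filebeat install directory in PowerShell:
.\filebeat.exe test config -c filebeat.yml   # validates the YAML syntax and settings
.\filebeat.exe test output -c filebeat.yml   # attempts a connection to the configured Elasticsearch hosts
```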

Can you add the -d "*" flag to Filebeat and check whether there is any info during the first 30 seconds of execution stating which files are being harvested?
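For example, running Filebeat in the foreground from its install directory (path is an assumption, substitute your own) with all debug selectors enabled:

```shell
# -e logs to stderr instead of the log file, -d "*" enables all debug selectors.
# Watch for "Harvester started for file" lines, or their absence.
cd 'C:\Program Files\Filebeat'
.\filebeat.exe -e -d "*" -c filebeat.yml
```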

Sounds like it is finding no files at c:\var\log\*.
Could this be a permission issue?
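To rule out a path or permissions problem, you could check from PowerShell that the glob actually matches files and inspect the directory's ACL (the path below is the one from the posted config; the service account running Filebeat is what needs read access):

```shell
# PowerShell: does the glob match anything at all?
Test-Path 'C:\var\log\*'

# List the files Filebeat should be picking up.
Get-ChildItem 'C:\var\log'

# Inspect who has access to the directory.
Get-Acl 'C:\var\log' | Format-List
```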

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.