I am attempting to have Filebeat read JSON packets from log files in a specific directory and send them directly to Elasticsearch, without using Logstash. I got the Filebeat service to start, but it keeps displaying the following message in the PowerShell console:
INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s
{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":359},"total":{"ticks":718,"value":718},"user":{"ticks":359}},"handles":{"open":238},"info":{"ephemeral_id":"da200c4c-dff4-4ff5-bdf2-15048ef0d495","uptime":{"ms":150221}},"memstats":...
Can anyone help with this? The ELK stack is running in Docker and does seem to be working: I can reach Kibana at http://xxx.xxx.xxx.x:5601/ in the browser.
The filebeat.yml config seems to be correct too:
filebeat.inputs:
# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.
- type: log
  # Change to true to enable this input configuration.
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - C:\var\log\*
    #- c:\programdata\elasticsearch\logs\*
....
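Note that a plain `log` input like the one above ships each line as raw text in the `message` field; to have Filebeat actually decode the JSON, the input needs `json.*` decoding options. A minimal sketch, assuming each log line is a single self-contained JSON object (the `json.keys_under_root` and `json.add_error_key` keys are standard Filebeat input options; the path is copied from the config above):

```yaml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - C:\var\log\*
  # Decode each line as JSON and place the parsed keys at the
  # top level of the output event instead of under a "json" key
  json.keys_under_root: true
  # Add an error field to the event if decoding fails
  json.add_error_key: true
```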
and the output config:
#============================= Filebeat modules ===============
filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml
  # Set to true to enable config reloading
  reload.enabled: false
  # Period on which files under path should be checked for changes
  #reload.period: 10s

#==================== Elasticsearch template setting ==========
setup.template.settings:
  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["192.xxx.xxx.x:9200"]
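As an aside, Filebeat ships with built-in self-tests that can quickly confirm a setup like this: `test config` validates the YAML, and `test output` attempts a connection to the configured Elasticsearch hosts. Run from the Filebeat install directory in PowerShell:

```powershell
# Validate filebeat.yml syntax and settings
.\filebeat.exe test config -c .\filebeat.yml

# Attempt a connection to the configured output (Elasticsearch)
.\filebeat.exe test output -c .\filebeat.yml
```

If `test output` succeeds, the recurring "Non-zero metrics" INFO line is not the problem; it is Filebeat's periodic internal-monitoring report, not an error.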