INFO Non-zero metrics in the last 30s


I'm new to ELK and need to set up log forwarding for my Kafka logs. After doing all the configuration, I ran the command below:

./filebeat -e -c /etc/filebeat/filebeat.yml -d "*"

and got the following output:

2017/12/13 05:40:44.328423 beat.go:426: INFO Home path: [/usr/share/filebeat/bin] Config path: [/usr/share/filebeat/bin] Data path: [/usr/share/filebeat/bin/data] Logs path: [/usr/share/filebeat/bin/logs]
2017/12/13 05:40:44.328456 beat.go:453: DBG Beat metadata path: /usr/share/filebeat/bin/data/meta.json
2017/12/13 05:40:44.328513 beat.go:433: INFO Beat UUID: b8d5d832-49ea-4bbc-a3c2-a4ab5c3983ad
2017/12/13 05:40:44.328535 beat.go:192: INFO Setup Beat: filebeat; Version: 6.0.1
2017/12/13 05:40:44.328590 beat.go:199: DBG Initializing output plugins
2017/12/13 05:40:44.328609 processor.go:49: DBG Processors: 
2017/12/13 05:40:44.328695 metrics.go:23: INFO Metrics logging every 30s
2017/12/13 05:40:44.328838 client.go:123: INFO Elasticsearch url: http://hostname:9200
2017/12/13 05:40:44.329069 logger.go:18: DBG start pipeline event consumer
2017/12/13 05:40:44.329110 module.go:80: INFO Beat name: ip-
2017/12/13 05:40:44.329200 modules.go:95: ERR Not loading modules. Module directory not found: /usr/share/filebeat/bin/module
2017/12/13 05:40:44.329325 beat.go:260: INFO filebeat start running.
2017/12/13 05:40:44.329373 registrar.go:88: INFO Registry file set to: /usr/share/filebeat/bin/data/registry
2017/12/13 05:40:44.329426 registrar.go:108: INFO Loading registrar data from /usr/share/filebeat/bin/data/registry
2017/12/13 05:40:44.329462 registrar.go:119: INFO States Loaded from registrar: 0
2017/12/13 05:40:44.329518 crawler.go:44: INFO Loading Prospectors: 1
2017/12/13 05:40:44.329561 reload.go:96: DBG Checking module configs from: /usr/share/filebeat/bin/modules.d/*.yml
2017/12/13 05:40:44.329581 reload.go:110: DBG Number of module configs found: 0
2017/12/13 05:40:44.329591 crawler.go:78: INFO Loading and starting Prospectors completed. Enabled prospectors: 0
2017/12/13 05:40:44.329620 registrar.go:150: INFO Starting Registrar
2017/12/13 05:40:44.329639 reload.go:128: INFO Config reloader started
2017/12/13 05:40:44.329689 reload.go:152: DBG Scan for new config files
2017/12/13 05:40:44.329709 reload.go:171: DBG Number of module configs found: 0
2017/12/13 05:40:44.329720 reload.go:220: INFO Loading of config files completed.
2017/12/13 05:41:14.329275 metrics.go:39: INFO Non-zero metrics in the last 30s: beat.memstats.gc_next=4473924 beat.memstats.memory_alloc=2996288 beat.memstats.memory_total=2996288 filebeat.harvester.open_files=0 filebeat.harvester.running=0 libbeat.config.module.running=0 libbeat.config.reloads=1 libbeat.output.type=elasticsearch libbeat.pipeline.clients=0 registrar.states.current=0
^C2017/12/13 05:41:16.250269 service.go:33: DBG Received sigterm/sigint, stopping
2017/12/13 05:41:16.250291 filebeat.go:311: INFO Stopping filebeat
2017/12/13 05:41:16.250306 crawler.go:105: INFO Stopping Crawler
2017/12/13 05:41:16.250317 crawler.go:115: INFO Stopping 0 prospectors
2017/12/13 05:41:16.250347 reload.go:223: INFO Dynamic config reloader stopped
2017/12/13 05:41:16.250373 crawler.go:131: INFO Crawler stopped
2017/12/13 05:41:16.250390 registrar.go:210: INFO Stopping Registrar
2017/12/13 05:41:16.250413 registrar.go:165: INFO Ending Registrar
2017/12/13 05:41:16.250429 registrar.go:228: DBG Write registry file: /usr/share/filebeat/bin/data/registry
2017/12/13 05:41:16.252784 registrar.go:253: DBG Registry file updated. 0 states written.
2017/12/13 05:41:16.252934 metrics.go:51: INFO Total non-zero values:  beat.memstats.gc_next=4473924 beat.memstats.memory_alloc=3021528 beat.memstats.memory_total=3021528 filebeat.harvester.open_files=0 filebeat.harvester.running=0 libbeat.config.module.running=0 libbeat.config.reloads=1 libbeat.output.type=elasticsearch libbeat.pipeline.clients=0 registrar.states.current=0 registrar.writes=1
2017/12/13 05:41:16.252957 metrics.go:52: INFO Uptime: 31.931636376s
2017/12/13 05:41:16.252967 beat.go:268: INFO filebeat stopped.


Here are the relevant parts of my filebeat.yml:

kafka.home: /opt/confluent-dev/confluent-3.3.0

#=========================== Filebeat prospectors =============================

filebeat.prospectors:

# Each - is a prospector. Most options can be set at the prospector level, so
# you can use different prospectors for various configurations.
# Below are the prospector specific configurations.

- type: log

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - ${kafka.home}/logs/kafkaServer*

  multiline.pattern: '^\s'
  multiline.negate: false
  multiline.match: after

  include_lines: ['GC pause']
  fields.pipeline: kafka-gc-logs

#============================= Filebeat modules ===============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

#==================== Elasticsearch template setting ==========================

setup.template.name: "kafka-gc-logs"
setup.template.pattern: "kafka-gc-logs-*"
setup.template.settings:
  index.number_of_shards: 3

#================================ Outputs =====================================

# Configure what output to use when sending the data collected by the beat.

#-------------------------- Elasticsearch output ------------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["hostname:9200"]
  index: 'kafka-gc-logs-%{+yyyy.MM.dd}'
  pipeline: '%{[fields.pipeline]}'
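A note on the `pipeline` setting: `%{[fields.pipeline]}` is resolved per event, so each event is routed to the ingest pipeline named in its `fields.pipeline` value ("kafka-gc-logs" here), and that ingest pipeline must already exist in Elasticsearch. Since only one prospector and one pipeline are involved, a simpler static sketch (assuming the same redacted `hostname`) would be:

    #-------------------------- Elasticsearch output ------------------------------
    # Simpler equivalent when only one ingest pipeline is used: name it directly
    # instead of resolving it from each event's fields.
    output.elasticsearch:
      hosts: ["hostname:9200"]
      index: 'kafka-gc-logs-%{+yyyy.MM.dd}'
      pipeline: kafka-gc-logs  # this ingest pipeline must already exist in Elasticsearch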

It seems that logs are not being forwarded to Elasticsearch from the files specified in filebeat.yml.

Is there anything I missed? Why does the output show "Enabled prospectors: 0" even though there are new log lines in the files?

It's resolved.

The prospector was disabled. As the comment in the default config says, change `enabled` to true to enable this prospector configuration:

  enabled: true
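For reference, a minimal sketch of the whole prospector section with the fix applied (paths and field names taken from the config posted above):

    filebeat.prospectors:
    - type: log
      # Without enabled: true the prospector stays off and the startup log shows
      # "Loading and starting Prospectors completed. Enabled prospectors: 0".
      enabled: true
      paths:
        - ${kafka.home}/logs/kafkaServer*
      # Join lines starting with whitespace onto the previous line.
      multiline.pattern: '^\s'
      multiline.negate: false
      multiline.match: after
      # Keep only GC pause lines and tag events with the target ingest pipeline.
      include_lines: ['GC pause']
      fields.pipeline: kafka-gc-logs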

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.