Filebeat not getting data from all configured sources

Greetings,

We are experiencing the following issue with Filebeat. After we start the Filebeat instance, it registers three inputs, but it only pulls data from one of them. Here is the debug log:

2018-11-27T11:34:26.895+0200	INFO	[beat]	instance/beat.go:870	Process info	{"system_info": {"process": {"cwd": "C:\\Program Files\\Filebeat", "exe": "C:\\Program Files\\Filebeat\\filebeat.exe", "name": "filebeat.exe", "pid": 37468, "ppid": 1864, "start_time": "2018-11-27T11:34:26.404+0200"}}}

2018-11-27T11:34:26.895+0200	INFO	instance/beat.go:278	Setup Beat: filebeat; Version: 6.5.1

2018-11-27T11:34:26.895+0200	DEBUG	[publish]	pipeline/consumer.go:137	start pipeline event consumer

2018-11-27T11:34:26.895+0200	INFO	[publisher]	pipeline/module.go:110	Beat name: APS01

2018-11-27T11:34:26.895+0200	INFO	instance/beat.go:400	filebeat start running.

2018-11-27T11:34:26.895+0200	INFO	registrar/registrar.go:97	No registry file found under: C:\Program Files\Filebeat\data\registry. Creating a new registry file.

2018-11-27T11:34:26.910+0200	INFO	[monitoring]	log/log.go:117	Starting metrics logging every 30s

2018-11-27T11:34:26.912+0200	INFO	registrar/registrar.go:134	Loading registrar data from C:\Program Files\Filebeat\data\registry

2018-11-27T11:34:26.912+0200	INFO	registrar/registrar.go:141	States Loaded from registrar: 0

2018-11-27T11:34:26.912+0200	WARN	beater/filebeat.go:374	Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.

2018-11-27T11:34:26.912+0200	INFO	crawler/crawler.go:72	Loading Inputs: 3

2018-11-27T11:34:26.912+0200	INFO	log/input.go:138	Configured paths: [c:\Users\svc_deadline\Desktop\thinkbox_lic.txt]

2018-11-27T11:34:26.912+0200	INFO	input/input.go:114	Starting input of type: log; ID: 17125463488813885808 

2018-11-27T11:34:26.912+0200	INFO	log/input.go:138	Configured paths: [c:\Users\svc_deadline\Desktop\nuke_lic.txt]

2018-11-27T11:34:26.912+0200	INFO	input/input.go:114	Starting input of type: log; ID: 12347386215356025784 

2018-11-27T11:34:26.912+0200	INFO	log/input.go:138	Configured paths: [c:\Program Files (x86)\AFLICS\aflics_inf.txt]

2018-11-27T11:34:26.912+0200	INFO	input/input.go:114	Starting input of type: log; ID: 14009961413226775810 

2018-11-27T11:34:26.912+0200	INFO	crawler/crawler.go:106	Loading and starting Inputs completed. Enabled inputs: 3

2018-11-27T11:34:26.912+0200	INFO	log/harvester.go:254	Harvester started for file: c:\Program Files (x86)\AFLICS\aflics_inf.txt

2018-11-27T11:34:26.912+0200	INFO	cfgfile/reload.go:150	Config reloader started

The only source it gets log data from is this one:

"source": "c:\\Program Files (x86)\\AFLICS\\aflics_inf.txt"

When I check the registry file, it only has an entry for that source as well.

[{"source":"c:\\Program Files (x86)\\AFLICS\\aflics_inf.txt","offset":334,"timestamp":"2018-11-27T13:39:41.170993+02:00","ttl":-1,"type":"log","meta":null,"FileStateOS":{"idxhi":1441792,"idxlo":165334,"vol":4236759987}}]

If I delete the registry file, it gets recreated right away, but again only that one source is present. Notably, the debug log above also shows a harvester starting for only that one file.

Does anyone know what might be the issue?

Thanks in advance!

Here is also the config file I am using:

filebeat.inputs:
# thinkbox license server
- type: log
  enabled: true
  paths:
    - c:\Users\svc_deadline\Desktop\thinkbox_lic.txt
  fields:
    source_type: license
    application_name: thinkbox_license
  # multiline.pattern: 'Users of ([[:alnum:]-]+):'
  # multiline.negate: true
  # multiline.match: after

# nuke license server
- type: log
  enabled: true
  paths:
    - c:\Users\svc_deadline\Desktop\nuke_lic.txt
  fields:
    source_type: license
    application_name: nuke_license
  # multiline.pattern: '([[:space:]]*)(count|obsolete): ([[:digit:]]+)'
  # multiline.negate: false
  # multiline.match: after

# fume fx licenses
- type: log
  enabled: true
  paths:
    - c:\Program Files (x86)\AFLICS\aflics_inf.txt
  fields:
    source_type: license
    application_name: fumefx_license

#============================= Filebeat modules ===============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

#==================== Elasticsearch template setting ==========================

setup.template.settings:
  index.number_of_shards: 3
  #index.codec: best_compression
  #_source.enabled: false

#================================ General =====================================

# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
#name:

# The tags of the shipper are included in their own field with each
# transaction published.
#tags: ["service-X", "web-tier"]

# Optional fields that you can specify to add additional information to the
# output.
#fields:
#  env: staging


#============================== Dashboards =====================================
# These settings control loading the sample dashboards to the Kibana index. Loading
# the dashboards is disabled by default and can be enabled either by setting the
# options here, or by using the `-setup` CLI flag or the `setup` command.
#setup.dashboards.enabled: false

# The URL from where to download the dashboards archive. By default this URL
# has a value which is computed based on the Beat name and version. For released
# versions, this URL points to the dashboard archive on the artifacts.elastic.co
# website.
#setup.dashboards.url:



#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["10.1.20.45:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"

#================================ Logging =====================================

# Sets log level. The default log level is info.
# Available log levels are: error, warning, info, debug
logging.level: debug

# At debug level, you can selectively enable logging only for some components.
# To enable all selectors use ["*"]. Examples of other selectors are "beat",
# "publish", "service".
#logging.selectors: ["*"]

#============================== Xpack Monitoring ===============================
# filebeat can export internal metrics to a central Elasticsearch monitoring
# cluster.  This requires xpack monitoring to be enabled in Elasticsearch.  The
# reporting is disabled by default.

# Set to true to enable the monitoring reporter.
#xpack.monitoring.enabled: false

# Uncomment to send the metrics to Elasticsearch. Most settings from the
# Elasticsearch output are accepted here as well. Any setting that is not set is
# automatically inherited from the Elasticsearch output configuration, so if you
# have the Elasticsearch output configured, you can simply uncomment the
# following line.
#xpack.monitoring.elasticsearch:

Are you sure that the path config is correct? If the file is not present, Filebeat silently waits for it to show up.
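One way to make the inputs more resilient to this is a glob in paths, since Filebeat expands glob patterns when scanning for files to harvest. A minimal sketch for the first input, assuming the license dump keeps the thinkbox_lic base name (the .* extension match is illustrative, not taken from your setup):

- type: log
  enabled: true
  paths:
    # matches thinkbox_lic.txt, thinkbox_lic.log, etc.
    - c:\Users\svc_deadline\Desktop\thinkbox_lic.*
  fields:
    source_type: license
    application_name: thinkbox_license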

Indeed, that was the problem! Two of the files are actually .log and only one is .txt, so two of the configured paths never matched an existing file. I guess we messed up the config deployment.
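For anyone hitting the same thing, the corrected path entries look like this (assuming, per the above, that the thinkbox and nuke dumps are the two .log files; the base names are unchanged):

  paths:
    - c:\Users\svc_deadline\Desktop\thinkbox_lic.log

  paths:
    - c:\Users\svc_deadline\Desktop\nuke_lic.log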

Thanks for pointing that out!
