Filebeat not harvesting after startup

After updating filebeat to 5.3.1 I noticed that it was no longer harvesting new log data. On startup it harvests the existing files, but then it fails to process any new events. For example, I create a new, empty file, hook it up in the prospectors section of the config file, echo a few lines into it, and start filebeat with filebeat -path.config /etc/filebeat. In /var/log/filebeat I then see:
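For reference, the relevant part of my config looks roughly like this (a minimal sketch; only the /var/log/f.log path and the console/elasticsearch outputs are taken from the log below, the rest are defaults I assume):

```yaml
filebeat.prospectors:
  - input_type: log
    paths:
      - /var/log/f.log

# Both outputs are active, matching the "Activated console" and
# "Activated elasticsearch" lines in the log below.
output.console:
  pretty: true

output.elasticsearch:
  hosts: ["localhost:9200"]
```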

2017-04-22T14:22:45+02:00 DBG  Disable stderr logging
2017-04-22T14:22:45+02:00 INFO Home path: [/usr/ports/sysutils/beats/filebeat] Config path: [/etc/filebeat] Data path: [/usr/ports/sysutils/beats/filebeat/data] Logs path: [/usr/ports/sysutils/beats/filebeat/logs]
2017-04-22T14:22:45+02:00 INFO Setup Beat: filebeat; Version: 5.3.1
2017-04-22T14:22:45+02:00 DBG  Processors: 
2017-04-22T14:22:45+02:00 DBG  Initializing output plugins
2017-04-22T14:22:45+02:00 INFO Activated console as output plugin.
2017-04-22T14:22:45+02:00 INFO Loading template enabled. Reading template file: /etc/filebeat/filebeat.template.json
2017-04-22T14:22:45+02:00 INFO Loading template enabled for Elasticsearch 2.x. Reading template file: /etc/filebeat/filebeat.template-es2x.json
2017-04-22T14:22:45+02:00 INFO Elasticsearch url: http://localhost:9200
2017-04-22T14:22:45+02:00 INFO Activated elasticsearch as output plugin.
2017-04-22T14:22:45+02:00 DBG  Create output worker
2017-04-22T14:22:45+02:00 DBG  Create output worker
2017-04-22T14:22:45+02:00 DBG  No output is defined to store the topology. The server fields might not be filled.
2017-04-22T14:22:45+02:00 INFO Publisher name: basalt
2017-04-22T14:22:45+02:00 INFO Flush Interval set to: 1s
2017-04-22T14:22:45+02:00 INFO Max Bulk Size set to: 2048
2017-04-22T14:22:45+02:00 DBG  create bulk processing worker (interval=1s, bulk size=2048)
2017-04-22T14:22:45+02:00 INFO Flush Interval set to: 1s
2017-04-22T14:22:45+02:00 INFO Max Bulk Size set to: 50
2017-04-22T14:22:45+02:00 DBG  create bulk processing worker (interval=1s, bulk size=50)
2017-04-22T14:22:45+02:00 ERR Not loading modules. Module directory not found: /usr/ports/sysutils/beats/filebeat/module
2017-04-22T14:22:45+02:00 INFO filebeat start running.
2017-04-22T14:22:45+02:00 INFO Registry file set to: /var/run/filebeat/.filebeat
2017-04-22T14:22:45+02:00 INFO Loading registrar data from /var/run/filebeat/.filebeat
2017-04-22T14:22:45+02:00 INFO States Loaded from registrar: 0
2017-04-22T14:22:45+02:00 INFO Loading Prospectors: 1
2017-04-22T14:22:45+02:00 DBG  File Configs: [/var/log/f.log]
2017-04-22T14:22:45+02:00 DBG  exclude_files: []
2017-04-22T14:22:45+02:00 INFO Prospector with previous states loaded: 0
2017-04-22T14:22:45+02:00 INFO Starting prospector of type: log; id: 2641201998090730361 
2017-04-22T14:22:45+02:00 INFO Loading and starting Prospectors completed. Enabled prospectors: 1
2017-04-22T14:22:45+02:00 INFO Metrics logging every 30s
2017-04-22T14:22:45+02:00 INFO Starting Registrar
2017-04-22T14:22:45+02:00 INFO Start sending events to output
2017-04-22T14:22:45+02:00 INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
2017-04-22T14:22:45+02:00 DBG  Start next scan
2017-04-22T14:22:45+02:00 DBG  Check file for harvesting: /var/log/f.log
2017-04-22T14:22:45+02:00 DBG  Start harvester for new file: /var/log/f.log
2017-04-22T14:22:45+02:00 DBG  New state added for /var/log/f.log
2017-04-22T14:22:45+02:00 DBG  Setting offset for file based on seek: /var/log/f.log
2017-04-22T14:22:45+02:00 DBG  Setting offset for file: /var/log/f.log. Offset: 0
[...]
2017-04-22T14:22:50+02:00 DBG  Flushing spooler because of timeout. Events flushed: 4
2017-04-22T14:22:50+02:00 DBG  Publish: {
[...]
  "input_type": "log",
  "message": "Sat Apr 22 14:22:18 CEST 2017",
  "offset": 30,
  "source": "/var/log/f.log",
  "type": "log"
}
2017-04-22T14:22:50+02:00 DBG  Publish: {
  [...]
  "input_type": "log",
  "message": "Sat Apr 22 14:22:19 CEST 2017",
  "offset": 60,
  "source": "/var/log/f.log",
  "type": "log"
}
2017-04-22T14:22:50+02:00 DBG  Publish: {
[...]
  "input_type": "log",
  "message": "Sat Apr 22 14:22:20 CEST 2017",
  "offset": 90,
  "source": "/var/log/f.log",
  "type": "log"
}
2017-04-22T14:22:50+02:00 DBG  output worker: publish 3 events
2017-04-22T14:22:50+02:00 DBG  output worker: publish 3 events
2017-04-22T14:22:50+02:00 DBG  ES Ping(url=http://localhost:9200, timeout=1m30s)
2017-04-22T14:22:50+02:00 CRIT Unable to publish events to console: <nil>
2017-04-22T14:22:50+02:00 CRIT Unable to publish events to console: <nil>
2017-04-22T14:22:50+02:00 CRIT Unable to publish events to console: <nil>
2017-04-22T14:22:50+02:00 DBG  Ping status code: 200
2017-04-22T14:22:50+02:00 INFO Connected to Elasticsearch version 5.3.1
2017-04-22T14:22:50+02:00 INFO Trying to load template for client: http://localhost:9200
2017-04-22T14:22:50+02:00 DBG  HEAD http://localhost:9200/_template/filebeat  <nil>
2017-04-22T14:22:50+02:00 INFO Template already exists and will not be overwritten.
2017-04-22T14:22:50+02:00 DBG  PublishEvents: 3 events have been  published to elasticsearch in 5.637663ms.
2017-04-22T14:22:50+02:00 DBG  send completed
2017-04-22T14:22:50+02:00 DBG  Shutting down sync publisher
2017-04-22T14:22:51+02:00 DBG  Run prospector
2017-04-22T14:22:51+02:00 DBG  Start next scan
2017-04-22T14:22:51+02:00 DBG  Check file for harvesting: /var/log/f.log
2017-04-22T14:22:51+02:00 DBG  Update existing file for harvesting: /var/log/f.log, offset: 90
2017-04-22T14:22:51+02:00 DBG  Harvester for file is still running: /var/log/f.log
2017-04-22T14:22:51+02:00 DBG  Prospector states cleaned up. Before: 1, After: 1
[...]
2017-04-22T14:23:15+02:00 INFO Non-zero metrics in the last 30s: filebeat.harvester.open_files=1 filebeat.harvester.running=1 filebeat.harvester.started=1 libbeat.es.call_count.PublishEvents=1 libbeat.es.publish.read_bytes=811 libbeat.es.publish.write_bytes=1315 libbeat.es.published_and_acked_events=3 libbeat.publisher.published_events=3
2017-04-22T14:23:15+02:00 DBG  Flushing spooler because of timeout. Events flushed: 0
[...]

At this point the data is printed to the console and pushed into Elasticsearch. So far, so good.

Now I echo a few more lines into the log file (making sure to append, not truncate):

DBG  Update existing file for harvesting: /var/log/f.log, offset: 90
[...]
DBG  Update existing file for harvesting: /var/log/f.log, offset: 120
[...]
DBG  Flushing spooler because of timeout. Events flushed: 0

As you can see, it does pick up the new data (the offset went from 90 to 120), but it never prints it to the console or publishes it to ES.
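Since append vs. truncate matters here (filebeat treats a file whose size shrinks below the stored offset as truncated and re-reads it), here is the kind of sanity check I did — on a scratch file, not the real /var/log/f.log, using 30-byte date lines like the ones above:

```shell
# '>>' appends and only grows the file; a single '>' would truncate it
# back to zero bytes. Scratch file, purely illustrative:
LOG=$(mktemp)
echo "Sat Apr 22 14:23:40 CEST 2017" >> "$LOG"
echo "Sat Apr 22 14:23:41 CEST 2017" >> "$LOG"
wc -c < "$LOG"   # 60 bytes: two 30-byte lines, matching the offset steps in the log
rm -f "$LOG"
```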

Is there any way to further analyse this issue and to figure out why it's not working?
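For completeness, this is how I have full debug logging enabled (the logging section of filebeat.yml, per the 5.x docs; the selectors list can presumably be narrowed to specific subsystems):

```yaml
logging:
  level: debug
  # "*" enables all debug selectors; something like
  # ["prospector", "harvester", "publish"] narrows the output.
  selectors: ["*"]
  to_files: true
  files:
    path: /var/log/filebeat
```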

Also, it seems the registry file is created but never updated? All it ever contains is [].
NB: I had to remove some parts of the logfile due to reaching the character limit for a post.

For the console output, it could be related to this issue: https://github.com/elastic/beats/issues/4041. That was fixed in master. How did you check that the data also didn't end up in ES? Do you still have the issue if you disable console output?
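To test that, the console section can be commented out so that only Elasticsearch remains (a sketch of the relevant filebeat.yml fragment; the host is the default from the log above):

```yaml
output.elasticsearch:
  hosts: ["localhost:9200"]

# Console output disabled while testing whether it is the culprit:
# output.console:
#   pretty: true
```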

That patch appears to fix both issues. Perhaps the "not outputting to ES" part was unrelated and merely a red herring. Either way, will a 5.3.2 be released shortly? I'm sure others are running into this too.

Not sure it will make it into 5.3.2 but it will be in an upcoming bugfix release.

This topic was automatically closed after 21 days. New replies are no longer allowed.