Filebeat not able to read new files

Filebeat: 5.1.1
ELK stack: 5.1.1

I have Filebeat running on my server with the following config.

filebeat.prospectors:

- input_type: log

  # Paths that should be crawled and fetched. Glob based paths.
  paths: 
    - <path to my json log file>\*.json
  tags: ["json"]
  
  scan_frequency: 10m
  close_inactive: 10m

 exclude_files: [".gz$"]

  # Type to be published in the 'type' field. 
  document_type: JSON

#================================ JSON =====================================

  # The regexp pattern that has to be matched. This pattern matches all lines starting with {
  multiline.pattern: ^\{

  # Defines if the pattern set under pattern should be negated or not. Default is false.
  #multiline.negate: false
  multiline.match: after


#================================ Outputs =====================================
#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["server.host:5044"]

My logs are updated every hour, and there is a file rotation mechanism that zips the file once it reaches 5 MB.

I checked the Filebeat registry file and found an entry for a file with a particular inode number, but no file with that inode exists in my directory anymore. The new file, which has a new inode number, is not picked up by the prospector.
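
The stale entry in the registry looks roughly like this (the path, inode and timestamp are placeholders, and the FileStateOS fields would look different on Windows):

[
  {
    "source": "<path to my json log file>\\logFile.json",
    "offset": 1274061,
    "FileStateOS": {"inode": 1234567, "device": 2049},
    "timestamp": "2017-06-15T09:48:10-05:00",
    "ttl": -1
  }
]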

Is filebeat blocked by the output? Have you checked filebeat logs for errors/warnings?

I checked the logs and did not find anything like that.
The last thing I see is this:

2017-06-15T10:48:10-05:00 INFO No non-zero metrics in the last 30s
2017-06-15T10:48:10-05:00 DBG  Run prospector
2017-06-15T10:48:10-05:00 DBG  Start next scan
2017-06-15T10:48:10-05:00 DBG  Check file for harvesting: path to json/JSON/dataLoadFramework.json
2017-06-15T10:48:10-05:00 DBG  Update existing file for harvesting: path to json/JSON/dataLoadFramework.json, offset: 1274061
2017-06-15T10:48:10-05:00 DBG  Harvester for file is still running: path to json/JSON/dataLoadFramework.json
2017-06-15T10:48:10-05:00 DBG  Prospector states cleaned up. Before: 1, After: 1
2017-06-15T10:48:40-05:00 INFO No non-zero metrics in the last 30s
2017-06-15T10:49:10-05:00 INFO No non-zero metrics in the last 30s
2017-06-15T10:49:40-05:00 INFO No non-zero metrics in the last 30s
2017-06-15T10:50:10-05:00 INFO No non-zero metrics in the last 30s
...and so on.

Can you elaborate on the file rotation mechanism? Is the file rotated/renamed and then zipped? How is the new file created? What exactly do you mean by "logs are updated every hour"? Is this a bulk write?

  • Could you share your "real" path for the json logs? There is a \ inside which made me suspicious.
  • Also, the exclude_files indentation seems to be off (see the corrected sketch below).
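
For example, exclude_files should sit at the same indentation level as the other prospector options, roughly like this (keeping your placeholder path):

filebeat.prospectors:
- input_type: log
  paths:
    - <path to my json log file>\*.json
  tags: ["json"]
  scan_frequency: 10m
  close_inactive: 10m
  # indented like the other prospector options
  exclude_files: [".gz$"]
  document_type: JSON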

File rotation is done by the logger: when the file reaches 5 MB, it is archived under a new filename (and inode number).
For example:
logFile.json, once it reaches 5 MB, becomes logFile-20-June-2016-01.json.tar.gz.

A new file named logFile.json is then created, and new logs are appended to it.
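
From the Filebeat 5.x docs, the prospector options that look relevant to this remove-and-recreate rotation would be roughly these (a sketch, not what I currently run):

  # stop harvesting as soon as the rotated file is renamed or removed
  close_renamed: true
  close_removed: true
  # drop registry entries for files that no longer exist on disk,
  # so stale inode entries do not linger
  clean_removed: true
  # scan for the newly created logFile.json more often than every 10 minutes
  scan_frequency: 30s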

Could you share your "real" path for the json logs? There is a \ inside which made me suspicious.

The path is correct, but I cannot share it as it is on an internal server.

Also the exclude_files indentation seems to be off.

I have fixed that, but still no success.
