Filebeat Config file

I need help with my Filebeat config:
I'm trying to read data from a log file whose size stays the same while its modification time changes every 30 minutes. During each 30-minute interval, new logs are written in place of the old ones.
Can someone suggest something? I've tried multiple approaches, but Filebeat isn't reading the newly added logs in the file.
Filebeat - 7.16.2
Running on Windows Server 2019

Filebeat Config:

filebeat.inputs:
- type: filestream
  enabled: true
  paths:
    - 'path of the file'
  tags: ['console_data']

Filebeat Logs:

After the initial read, it keeps giving me the same response:

2023-04-24T06:19:33.495Z INFO [file_watcher] filestream/fswatch.go:137 Start next scan
2023-04-24T06:19:33.496Z DEBUG [file_watcher] filestream/fswatch.go:204 Found 1 paths
2023-04-24T06:19:33.496Z DEBUG [input.filestream] filestream/prospector.go:164 File C:\Lotus\Domino\Data\IBM_TECHNICAL_SUPPORT\console.log has been updated {"id": "41C27034C04F35E3", "prospector": "file_prospector", "operation": "write", "source_name": "native::153485312-87546-3234102977", "os_id": "153485312-87546-3234102977", "new_path": "C:\Lotus\Domino\Data\IBM_TECHNICAL_SUPPORT\console.log", "old_path": "C:\Lotus\Domino\Data\IBM_TECHNICAL_SUPPORT\console.log"}
2023-04-24T06:19:33.496Z DEBUG [input.filestream] input-logfile/harvester.go:145 Starting harvester for file {"id": "41C27034C04F35E3", "source": "filestream::.global::native::153485312-87546-3234102977"}
2023-04-24T06:19:33.497Z DEBUG [input.filestream] input-logfile/harvester.go:181 Stopped harvester for file {"id": "41C27034C04F35E3", "source": "filestream::.global::native::153485312-87546-3234102977"}
2023-04-24T06:19:43.322Z DEBUG [input.filestream] filestream/filestream.go:131 End of file reached: C:\Lotus\Domino\Data\IBM_TECHNICAL_SUPPORT\console.log; Backoff now. {"id": "41C27034C04F35E3", "source": "filestream::.global::native::153485312-87546-3234102977", "path": "C:\Lotus\Domino\Data\IBM_TECHNICAL_SUPPORT\console.log", "state-id": "native::153485312-87546-3234102977"}
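The "End of file reached … Backoff now" loop fits an in-place rewrite: Filebeat's stored read offset already sits at the end of the file, and since the file never grows (or shrinks, which would be detected as truncation), the harvester never sees new bytes. As a hedged workaround sketch (not a guaranteed fix), the filestream options below drop the stored state once the file goes quiet, so it is re-read from the beginning on a later scan; the `id` value and time intervals are illustrative assumptions, and re-reading from offset 0 can produce duplicate events:

```yaml
# Sketch: filestream input for a file rewritten in place at constant size.
# Interval values are assumptions; tune them to the 30-minute rewrite cycle.
filebeat.inputs:
- type: filestream
  id: console-log                      # a unique id is recommended for filestream
  enabled: true
  paths:
    - 'path of the file'
  tags: ['console_data']
  # Stop tracking files not modified within this window.
  ignore_older: 45m
  # Remove the stored state (read offset) after inactivity, so the file is
  # treated as new and re-read from the start; must be greater than
  # ignore_older plus the scanner check interval. May duplicate events.
  clean_inactive: 1h
  # How often the prospector scans for file changes.
  prospector.scanner.check_interval: 30s
```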

Hi @Dasher,
Below I've shared an example filebeat.yml.
Read the documentation and try ignore_older: any file modified within the last hour will always be read, while files whose logs were last altered more than an hour ago won't be.

filebeat.inputs:
- type: log
  enabled: true
  paths:
   - /p/logs/mis/**/*.log
  ignore_older: 1h
  include_lines: 
  - ^[0-9]{4}-[0-9]{2}-[0-9]{2}T[0-9]{2}:[0-9]{2}:[0-9]{2}.[0-9]{3}Z ?(.*)
  multiline.type: pattern
  multiline.pattern: ^[0-9]{4}-[0-9]{2}-[0-9]{2}T[0-9]{2}:[0-9]{2}:[0-9]{2}.[0-9]{3}Z
  multiline.negate: true
  multiline.match: after
  scan_frequency: 30s
  harvester_limit: 100
  close_inactive: 30m
  close_removed: true
  clean_removed: true
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
setup.ilm.enabled: true
setup.ilm.check_exists: true
setup.ilm.rollover_alias: mis-log
setup.ilm.pattern: '{now/d}-000001'
setup.ilm.overwrite: false
setup.kibana:
  host: http://abc:5601
output.elasticsearch:
  hosts:
  - http://abc:9200
processors:
- add_host_metadata: null
- drop_fields:
    when:
      equals:
        agent.type: filebeat
    fields:
    - agent.hostname
    - agent.id
    - agent.type
    - agent.ephemeral_id
    - agent.version
    - log.offset
    - log.flags
    - input.type
    - ecs.version
    - host.os
    - host.id
    - host.mac
    - host.architecture
monitoring.enabled: true
monitoring.elasticsearch: null
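Since the original question uses the filestream input (and the log input is deprecated as of 7.16), the input section above could also be written as a filestream equivalent. This is a sketch, not a tested config: the `id` value is an assumption I've added, the paths and patterns are carried over from the example, and the multiline settings move under `parsers:` with the filestream-specific option names:

```yaml
# Sketch: filestream equivalent of the log input above (Filebeat 7.16+).
filebeat.inputs:
- type: filestream
  id: mis-logs                          # assumed id; filestream inputs need a unique id
  enabled: true
  paths:
    - /p/logs/mis/**/*.log
  ignore_older: 1h
  include_lines:
    - '^[0-9]{4}-[0-9]{2}-[0-9]{2}T[0-9]{2}:[0-9]{2}:[0-9]{2}\.[0-9]{3}Z ?(.*)'
  # In filestream, multiline is configured as a parser.
  parsers:
    - multiline:
        type: pattern
        pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}T[0-9]{2}:[0-9]{2}:[0-9]{2}\.[0-9]{3}Z'
        negate: true
        match: after
  # filestream names for the corresponding log-input options:
  prospector.scanner.check_interval: 30s   # replaces scan_frequency
  harvester_limit: 100
  close.on_state_change.inactive: 30m      # replaces close_inactive
  close.on_state_change.removed: true      # replaces close_removed
  clean_removed: true
```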

