Process unable to write to log file monitored by filebeat

I'm trying to use filebeat 7.3.0 on Windows Server 2016 to monitor two log files stored on a local disk, and I'm running into a problem where the logging process can no longer write to those files while filebeat is monitoring them. The log files are written to by a remote MSSQL SSIS service that runs once per hour.

What is happening is that filebeat successfully monitors the files and pulls in events; however, SSIS is only able to update the log files a single time after filebeat starts. After that, SSIS receives the error "The process cannot access the file because it is being used by another process".

Killing filebeat lets the SSIS job write to the files again, but then of course filebeat isn't forwarding log entries.
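For what it's worth, here is a minimal sketch of one way Windows sharing modes can produce exactly that message. It assumes the writer asks for an exclusive open (an untested guess about what the SSIS logging side does; this is not filebeat's actual open logic): once any other process holds even a read handle, an exclusive open fails with error 32, ERROR_SHARING_VIOLATION, which is the "being used by another process" message.

# Minimal sketch, assuming the writer requests an exclusive open.
# Illustrative only; not filebeat's actual open logic.
import ctypes
import tempfile
from ctypes import wintypes

GENERIC_READ  = 0x80000000
GENERIC_WRITE = 0x40000000
FILE_SHARE_READ   = 0x00000001
FILE_SHARE_WRITE  = 0x00000002
FILE_SHARE_DELETE = 0x00000004
OPEN_EXISTING = 3

kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
kernel32.CreateFileW.restype = wintypes.HANDLE
kernel32.CreateFileW.argtypes = [
    wintypes.LPCWSTR, wintypes.DWORD, wintypes.DWORD, wintypes.LPVOID,
    wintypes.DWORD, wintypes.DWORD, wintypes.HANDLE,
]
kernel32.CloseHandle.argtypes = [wintypes.HANDLE]
INVALID_HANDLE_VALUE = wintypes.HANDLE(-1).value

# Stand-in file so the sketch is self-contained.
tmp = tempfile.NamedTemporaryFile(delete=False)
tmp.close()
path = tmp.name

# "Reader" (think: a harvester) holds the file open and shares everything.
reader = kernel32.CreateFileW(
    path, GENERIC_READ,
    FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
    None, OPEN_EXISTING, 0, None)

# "Writer" demands exclusive access (share mode 0). Because another handle is
# already open, this fails with error 32 (ERROR_SHARING_VIOLATION), i.e.
# "The process cannot access the file because it is being used by another process".
writer = kernel32.CreateFileW(path, GENERIC_WRITE, 0,
                              None, OPEN_EXISTING, 0, None)
if writer == INVALID_HANDLE_VALUE:
    print("write open failed, error code:", ctypes.get_last_error())

kernel32.CloseHandle(reader)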

How can I troubleshoot or resolve this issue?
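The only concrete check I can think of is enumerating which processes hold the files open at the moment the SSIS job fails, to confirm it really is filebeat.exe holding them. A rough sketch, assuming the third-party psutil package is installed:

# Rough sketch: list processes that currently hold one of the monitored
# log files open (assumes the third-party psutil package is installed).
import psutil

TARGETS = {
    r"c:\logs\sql\server\all-object-modifications-new.json",
    r"c:\logs\sql\server\security-audit-new.json",
}

for proc in psutil.process_iter(["pid", "name"]):
    try:
        for f in proc.open_files():
            if f.path.lower() in TARGETS:
                print(f"{proc.info['name']} (pid {proc.info['pid']}) has {f.path} open")
    except (psutil.AccessDenied, psutil.NoSuchProcess):
        continue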

Here is my config:

---
#=========================== Filebeat inputs =============================

filebeat.inputs:
# Input configuration for all-object-modifications-new.json
- type: log
  enabled: true
  tags: ["sql","custom"]
  paths:
    - C:\Logs\SQL\server\all-object-modifications-new.json
  json.keys_under_root: false
  json.ignore_decoding_error: false
  json.add_error_key: true
  encoding: utf-8
  # ignore_older must be greater than close_inactive or disabled ("0").
  ignore_older: 0
  scan_frequency: 20s
  backoff: 1s
  backoff_factor: 2
  # max_backoff should be set to >= backoff and <= scan_frequency.
  max_backoff: 20s
  harvester_limit: 0
  close_inactive: 90m
  close_renamed: false
  close_removed: true
  close_eof: false
  close_timeout: 0
  
# Input configuration for security-audit-new.json
- type: log
  enabled: true
  tags: ["sql","custom"]
  paths:
    - C:\Logs\SQL\server\security-audit-new.json
  json.keys_under_root: false
  json.ignore_decoding_error: false
  json.add_error_key: true
  encoding: utf-8
  # ignore_older must be greater than close_inactive or disabled ("0").
  ignore_older: 0
  scan_frequency: 20s
  backoff: 1s
  backoff_factor: 2
  # max_backoff should be set to >= backoff and <= scan_frequency.
  max_backoff: 20s
  harvester_limit: 0
  close_inactive: 24h
  close_renamed: false
  close_removed: true
  close_eof: false
  close_timeout: 0
  
#=========================== Filebeat output =============================

output.console:
  enabled: true
  pretty: true

#output.logstash:
#  enabled: true
#  hosts: ["127.0.0.1:5044"]
#  loadbalance: false
#  ttl: 0
#  timeout: 30
#  slow_start: false
...
