Problem with Filebeat

Hello, I have a problem with Filebeat 7.12. The Filebeat agent is not working correctly. Two log sources were configured initially; data is still received from one source, but the second worked for about a month and then stopped sending data to Logstash. Here is my configuration:

# ============================== Filebeat inputs ===============================

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: true
  ignore_older: 48h
  tags: ["ups"]
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- T:\FTP\ftp_3744\UPS\*
    - T:\FTP\ftp_3744\UPS\uk-nmc01*
    - T:\FTP\ftp_3744\UPS\ko-nmc01*
    - T:\FTP\ftp_3744\UPS\*ts-nmc01*
    #- c:\programdata\elasticsearch\logs\*

- type: log

  # Change to true to enable this input configuration.
  enabled: true
  ignore_older: 48h
  tags: ["ftp"]
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - T:\FTPLOG\*.txt

  # Exclude lines. A list of regular expressions to match. It drops the lines that are
  # matching any regular expression from the list.
  #exclude_lines: ['^DBG']

  # Include lines. A list of regular expressions to match. It exports the lines that are
  # matching any regular expression from the list.
  #include_lines: ['^ERR', '^WARN']

  # Exclude files. A list of regular expressions to match. Filebeat drops the files that
  # are matching any regular expression from the list. By default, no files are dropped.
  #exclude_files: ['.gz$']

  # Optional additional fields. These fields can be freely picked
  # to add additional information to the crawled log files for filtering
  #fields:
  #  level: debug
  #  review: 1

  ### Multiline options

  # Multiline can be used for log messages spanning multiple lines. This is common
  # for Java Stack Traces or C-Line Continuation

  # The regexp Pattern that has to be matched. The example pattern matches all lines starting with [
  #multiline.pattern: ^\[
  #multiline.type: pattern
  #multiline.pattern: '^Date'
  #multiline.negate: true
  #multiline.match: after

  # Defines if the pattern set under pattern should be negated or not. Default is false.
  #multiline.negate: false

  # Match can be set to "after" or "before". It is used to define if lines should be append to a pattern
  # that was (not) matched before or after or as long as a pattern is not matched based on negate.
  # Note: After is the equivalent to previous and before is the equivalent to next in Logstash
  #multiline.match: after

# filestream is an experimental input. It is going to replace log input in the future.
- type: filestream

  # Change to true to enable this input configuration.
  enabled: false

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/*.log
    #- c:\programdata\elasticsearch\logs\*

  # Exclude lines. A list of regular expressions to match. It drops the lines that are
  # matching any regular expression from the list.
  #exclude_lines: ['^DBG']

  # Include lines. A list of regular expressions to match. It exports the lines that are
  # matching any regular expression from the list.
  #include_lines: ['^ERR', '^WARN']

  # Exclude files. A list of regular expressions to match. Filebeat drops the files that
  # are matching any regular expression from the list. By default, no files are dropped.
  #prospector.scanner.exclude_files: ['.gz$']

  # Optional additional fields. These fields can be freely picked
  # to add additional information to the crawled log files for filtering
  #fields:
  #  level: debug
  #  review: 1

# ============================== Filebeat modules ==============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

# ======================= Elasticsearch template setting =======================

The problem is with the logs tagged "ftp".
From that directory, only incomplete data is sent, roughly once a day, even though data is written to the file constantly.
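The agent log below shows the harvester for `T:\FTPLOG\ftp.dtek.com010222.txt` being closed after the default `close_inactive` of 5m. If the file is appended in bursts with long quiet periods, one thing worth trying is tuning the harvester timeouts on the "ftp" input. A sketch only, with example values rather than recommendations:

```yaml
# Sketch: possible tuning for the "ftp" input, assuming the harvester
# keeps closing between writes. Values are illustrative.
- type: log
  enabled: true
  tags: ["ftp"]
  paths:
    - T:\FTPLOG\*.txt
  # Keep the harvester open longer between writes (default is 5m).
  close_inactive: 30m
  # Check for new files / resumed writes more often (default is 10s).
  scan_frequency: 5s
  # ignore_older should stay well above close_inactive, otherwise a file
  # that goes quiet may never be picked up again.
  ignore_older: 48h
```

`close_inactive`, `scan_frequency`, and `ignore_older` are standard options of the `log` input; whether they fix this particular case depends on how the FTP server writes the file (append vs. rewrite/truncate).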
Agent log:

2022-02-01T11:30:12.075+0200	INFO	[monitoring]	log/log.go:144	Non-zero metrics in the last 30s	{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":14781,"time":{"ms":610}},"total":{"ticks":34781,"time":{"ms":1189},"value":34781},"user":{"ticks":20000,"time":{"ms":579}}},"handles":{"open":206},"info":{"ephemeral_id":"fef6b1f7-daac-427b-895c-43418552ccf9","uptime":{"ms":660286}},"memstats":{"gc_next":22982688,"memory_alloc":20939688,"memory_total":4487014432,"rss":62889984},"runtime":{"goroutines":45}},"filebeat":{"events":{"active":-41,"added":3601,"done":3642},"harvester":{"open_files":2,"running":2}},"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"acked":3642,"active":0,"batches":28,"total":3642},"read":{"bytes":168},"write":{"bytes":143210}},"pipeline":{"clients":2,"events":{"active":78,"published":3601,"total":3601},"queue":{"acked":3642}}},"registrar":{"states":{"current":714,"update":3642},"writes":{"success":28,"total":28}}}}}
2022-02-01T11:30:15.095+0200	INFO	log/harvester.go:302	Harvester started for file: T:\FTPLOG\ftp.dtek.com010222.txt
2022-02-01T11:35:12.080+0200	INFO	[monitoring]	log/log.go:144	Non-zero metrics in the last 30s	{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":20406,"time":{"ms":594}},"total":{"ticks":45812,"time":{"ms":1125},"value":45812},"user":{"ticks":25406,"time":{"ms":531}}},"handles":{"open":206},"info":{"ephemeral_id":"fef6b1f7-daac-427b-895c-43418552ccf9","uptime":{"ms":960287}},"memstats":{"gc_next":23455248,"memory_alloc":14047496,"memory_total":5882207952,"rss":63545344},"runtime":{"goroutines":45}},"filebeat":{"events":{"active":-8,"added":2419,"done":2427},"harvester":{"open_files":2,"running":2}},"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"acked":2427,"active":0,"batches":30,"total":2427},"read":{"bytes":180},"write":{"bytes":118519}},"pipeline":{"clients":2,"events":{"active":99,"published":2419,"total":2419},"queue":{"acked":2427}}},"registrar":{"states":{"current":715,"update":2427},"writes":{"success":30,"total":30}}}}}
2022-02-01T11:35:19.424+0200	INFO	log/harvester.go:333	File is inactive: T:\FTPLOG\ftp.dtek.com010222.txt. Closing because close_inactive of 5m0s reached.
2022-02-01T11:19:12.055+0200	INFO	[beat]	instance/beat.go:1012	Host info	{"system_info": {"host": {"architecture":"x86_64","boot_time":"2022-01-19T22:31:04.1+02:00","name":"server-ftp00","ip":["10.1.1.1"],"kernel_version":"10.0.14393.4886 (rs1_release.220104-1735)","mac":["xxxxx"],"os":{"windows"},"timezone":"EET","timezone_offset_sec":7200,"id":"5c66b805-14a2-43a2-ac19-2a371fd0d0ae"}}}
2022-02-01T11:19:12.055+0200	INFO	[beat]	instance/beat.go:1041	Process info	{"system_info": {"process": {"cwd": "C:\\Windows\\system32", "exe": "C:\\Program Files\\Elastic\\Beats\\7.12.1\\filebeat\\filebeat.exe", "name": "filebeat.exe", "pid": 3284, "ppid": 644, "start_time": "2022-02-01T11:19:11.532+0200"}}}
2022-02-01T11:19:12.055+0200	INFO	instance/beat.go:304	Setup Beat: filebeat; Version: 7.12.1
2022-02-01T11:19:12.057+0200	INFO	[publisher]	pipeline/module.go:113	Beat name: server-ftp00
2022-02-01T11:19:12.066+0200	WARN	beater/filebeat.go:178	Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2022-02-01T11:19:12.066+0200	INFO	instance/beat.go:468	filebeat start running.
2022-02-01T11:19:12.066+0200	INFO	[monitoring]	log/log.go:117	Starting metrics logging every 30s
2022-02-01T11:19:12.442+0200	INFO	memlog/store.go:119	Loading data file of 'C:\ProgramData\Elastic\Beats\filebeat\data\registry\filebeat' succeeded. Active transaction id=692720547
2022-02-01T11:19:12.623+0200	INFO	memlog/store.go:124	Finished loading transaction log file for 'C:\ProgramData\Elastic\Beats\filebeat\data\registry\filebeat'. Active transaction id=692732335
2022-02-01T11:19:12.624+0200	WARN	beater/filebeat.go:381	Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2022-02-01T11:19:12.628+0200	INFO	[registrar]	registrar/registrar.go:109	States Loaded from registrar: 714
2022-02-01T11:19:12.628+0200	INFO	[crawler]	beater/crawler.go:71	Loading Inputs: 3
2022-02-01T11:19:14.496+0200	INFO	log/input.go:157	Configured paths: [T:\FTP\ftp_3744295\UPS\uck-nmc01* T:\FTP\ftp_3744295\UPS\kgo-nmc01* T:\FTP\ftp_3744295\UPS\*ts-nmc01*]
2022-02-01T11:19:14.496+0200	INFO	[crawler]	beater/crawler.go:141	Starting input (ID: 12979056705062838685)
2022-02-01T11:19:20.491+0200	INFO	log/input.go:157	Configured paths: [T:\FTPLOG\*.txt]
2022-02-01T11:19:20.491+0200	INFO	[crawler]	beater/crawler.go:141	Starting input (ID: 5616210070077520035)
2022-02-01T11:19:20.499+0200	INFO	[crawler]	beater/crawler.go:108	Loading and starting Inputs completed. Enabled inputs: 2
2022-02-01T11:19:20.499+0200	INFO	cfgfile/reload.go:164	Config reloader started
2022-02-01T11:19:20.499+0200	INFO	cfgfile/reload.go:224	Loading of config files completed.
2022-02-01T11:19:20.524+0200	INFO	log/harvester.go:302	Harvester started for file: T:\FTPLOG\ftp.dtek.com010222.txt
2022-02-01T11:19:21.526+0200	INFO	[publisher_pipeline_output]	pipeline/output.go:143	Connecting to backoff(async(tcp://10.10.10.10:5055))
2022-02-01T11:19:21.526+0200	INFO	[publisher]	pipeline/retry.go:219	retryer: send unwait signal to consumer
2022-02-01T11:19:21.526+0200	INFO	[publisher]	pipeline/retry.go:223	  done
2022-02-01T11:19:21.527+0200	INFO	[publisher_pipeline_output]	pipeline/output.go:151	Connection to backoff(async(tcp://10.10.10.10:5055)) established
2022-02-01T11:19:42.068+0200	INFO	[monitoring]	log/log.go:144	Non-zero metrics in the last 30s	{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":2656,"time":{"ms":2656}},"total":{"ticks":9593,"time":{"ms":9593},"value":9593},"user":{"ticks":6937,"time":{"ms":6937}}},"handles":{"open":191},"info":{"ephemeral_id":"fef6b1f7-daac-427b-895c-43418552ccf9","uptime":{"ms":30286}},"memstats":{"gc_next":23142752,"memory_alloc":16715624,"memory_sys":41139768,"memory_total":1368041552,"rss":61394944},"runtime":{"goroutines":40}},"filebeat":{"events":{"active":57,"added":3054,"done":2997},"harvester":{"open_files":1,"running":1,"started":1}},"libbeat":{"config":{"module":{"running":0},"reloads":1,"scans":1},"output":{"events":{"acked":2483,"active":0,"batches":21,"total":2483},"read":{"bytes":126},"type":"logstash","write":{"bytes":107084}},"pipeline":{"clients":2,"events":{"active":57,"filtered":514,"published":2540,"retry":1042,"total":3054},"queue":{"acked":2483}}},"registrar":{"states":{"current":714,"update":2997},"writes":{"success":535,"total":535}},"system":{"cpu":{"cores":4}}}}}

Could you please share debug logs of the problematic input?

How can I upload debug logs here?

You can just copy-paste them here, or use Pastebin or whatever service you prefer if the file is too big.
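To produce the debug logs in the first place, the logging level can be raised in `filebeat.yml` and the service restarted. A minimal sketch, assuming a default Windows service install (the selector names and log path are examples, not the only valid values):

```yaml
# Sketch: enable debug logging in filebeat.yml, then restart the service.
# Selectors limit debug output to the relevant code paths.
logging.level: debug
logging.selectors: ["harvester", "input", "crawler"]
logging.to_files: true
logging.files:
  # Assumed default log directory for the Windows service install.
  path: C:\ProgramData\Elastic\Beats\filebeat\logs
```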

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.