How to Resolve Filebeat File Reading and Ingest Target Warning Issues

Hello Team,

I am currently using the Filebeat configuration shown below. Recently, Filebeat has started logging frequent warning messages containing the text "points to an already known ingest target".

I noticed that if I do not configure the fingerprint offset and length parameters to accommodate smaller files, Filebeat ignores those files entirely. To address this, I set the fingerprint length to its minimum value so that all log files, regardless of size, are read (see the snippet below). However, even after making this change, Filebeat continues to generate the warning messages.
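
For reference, these are the fingerprint scanner settings from the full configuration further down. As far as I know, 64 bytes is the minimum fingerprint length Filebeat accepts, which is why I chose it so that even very short files are fingerprinted and read:

prospector.scanner.fingerprint:
  offset: 0     # hash from the very beginning of the file
  length: 64    # minimum supported fingerprint length, so short files are also picked up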

I have also observed a second issue: Filebeat does not read files that contain a large amount of data. If I delete such a log file and recreate it a few minutes later, Filebeat then reads the data as expected. Likewise, when a file initially contains a large amount of content, Filebeat does not read it, but if the file contains a smaller amount of data, it is read without any problems.

I am unable to understand why Filebeat behaves this way. Could someone please help me resolve these two issues:

  1. The continuous warning messages.

  2. Filebeat not reading files with large amounts of data unless they are recreated or contain less content.

Thank you for your assistance.
The currently configured Filebeat input is shown below:

- type: filestream
  id: backup
  paths:
    - /backup/log/v7*_backup_*.log   # OldLocation
    - /tmp/p_zbackup.sh.log*         # NewLocation
  prospector.scanner.fingerprint:
    offset: 0
    length: 64
  close.on_state_change.renamed: true
  close.on_state_change.inactive: 30s
  ignore_older: 48h
  clean_inactive: 72h
  parsers:
    - multiline:
        type: pattern
        pattern: '^\d{8} \d{2}:\d{2}:\d{2}|^\w{3} \w{3}\s+\d{1,2} \d{2}:\d{2}:\d{2} \w{3,4} \d{4}'
        negate: true
        match: after
        timeout: 30s
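
For context, the multiline pattern above is meant to treat any line that begins with one of two timestamp formats as the start of a new event. Hypothetical examples of lines it would match (not taken from my actual logs) are:

20250131 14:05:22 backup started
Fri Jan 31 14:05:22 IST 2025 backup started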

Thanks in advance!

Siva

Hello Team,

Can someone help me fix these two issues?