I have a problem with Filebeat running on Linux: every time I stop Filebeat and then start it again, it reads the log files from the beginning.
The log file is uploaded by a Windows batch job every 5 minutes, in two steps:
- put a temporary file
- replace the original file by renaming the temporary file
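The two steps above amount to an atomic replace. A minimal Python sketch of the same sequence (file names and directory are hypothetical stand-ins for `/logs`; the real transfer is done by the Windows batch job):

```python
import os
import tempfile

logdir = tempfile.mkdtemp()            # stand-in for /logs
tmp = os.path.join(logdir, "AB1.tmp")  # hypothetical temporary file
final = os.path.join(logdir, "AB1")    # hypothetical final file

# 1. put the temporary file
with open(tmp, "w") as f:
    f.write("INFO new lines\n")

# 2. replace the original by renaming (atomic on POSIX filesystems)
os.replace(tmp, final)

print(open(final).read())  # → INFO new lines
```

Because the rename replaces the inode of the original file, the file Filebeat is tracking under a given path changes identity on every upload cycle.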
Filebeat reads only the lines added since the last change correctly, except after a restart.
Linux version: Red Hat Enterprise Linux Server release 6.9 (Santiago)
Filebeat version: 7.16.2
My Filebeat configuration:

```yaml
filebeat.inputs:
  - type: filestream
    enabled: true
    paths:
      - /logs/AB*
    file_identity.path: ~
    encoding: GB2312
    parsers:
      - multiline:
          type: pattern
          pattern: '^.*INFO'
          negate: true
          match: after
          flush_pattern: '.*Testing End.*'
          max_lines: 10000

output.kafka:
  hosts: ["192.168.1.1:9092"]
  topic: "filebeat-logs"
  enabled: true
  max_message_bytes: 10000000
  required_acks: 0
```
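For context, the multiline settings above (`negate: true`, `match: after`) append every line that does not match `^.*INFO` to the preceding INFO line, so one event spans from one INFO line up to the next. A minimal Python sketch of that grouping rule, with made-up sample lines (illustration only, not Filebeat's implementation):

```python
import re

pattern = re.compile(r'^.*INFO')  # lines matching this start a new event

# Hypothetical sample log lines
lines = [
    "2024-01-01 INFO start",
    "  detail 1",
    "  detail 2",
    "2024-01-01 INFO Testing End",
]

events, current = [], []
for line in lines:
    # negate: true, match: after → a non-matching line is appended
    # to the current event; a matching line starts a new one
    if pattern.search(line) and current:
        events.append("\n".join(current))
        current = []
    current.append(line)
if current:
    events.append("\n".join(current))

print(len(events))  # → 2
```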
Is there any way to stop Filebeat from re-reading the log files from the beginning when it restarts, and thus avoid duplicate events?