I used Filebeat to ship logs to Kafka, and I noticed that every time I updated the log file, Filebeat sent duplicate content to Kafka. Any help to fix that?
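For context, a minimal Filebeat-to-Kafka setup along these lines might look like the following sketch (the path, broker address, and topic name are placeholders I've assumed, not details from the post):

```yaml
# filebeat.yml (sketch, not the poster's actual config)
filebeat.inputs:
  - type: log
    paths:
      - /path/to/testfile.log   # placeholder path to the log file being tailed

output.kafka:
  hosts: ["localhost:9092"]     # placeholder Kafka broker address
  topic: "testlog"              # placeholder topic name
```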
Exactly how did you update the file? If you used an editor, a new file is generally created behind the scenes and renamed over the old one, which could be why the whole file is reread on every modification: Filebeat uses inodes to keep track of files, not just the name and path. Instead, make sure you append to the file, e.g. using a script or
echo "data" >> testfile.log
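You can see the inode behavior directly. This sketch contrasts an editor-style save (write a temp file, rename it into place, which produces a new inode) with a plain append (same inode, so Filebeat only picks up the new lines). It assumes GNU `stat` (`stat -c %i`); on BSD/macOS the equivalent is `stat -f %i`.

```shell
# Start with a fresh file and record its inode
echo "line 1" > testfile.log
inode_before=$(stat -c %i testfile.log)

# Editor-style save: write a temp file and rename it over the original.
# The renamed temp file brings its own inode, so Filebeat sees a "new" file.
echo "line 1 edited" > testfile.log.tmp
mv testfile.log.tmp testfile.log
inode_after_save=$(stat -c %i testfile.log)

# Append-style update: the inode stays the same, so Filebeat
# resumes from its recorded offset and reads only the new line.
echo "line 2" >> testfile.log
inode_after_append=$(stat -c %i testfile.log)

[ "$inode_before" != "$inode_after_save" ] && echo "editor-style save changed the inode"
[ "$inode_after_save" = "$inode_after_append" ] && echo "append kept the inode"
```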
This is really helpful.
After I switched to appending as you suggested (echo "data" >> testfile.log), Filebeat only reads the newly added lines.
But I still get duplicate logs: Filebeat outputs the same content twice.
Can you please check the Filebeat log? Filebeat publishes metrics in its log files, and they could give you a clue.
Default location: /var/log/filebeat/filebeat
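A quick way to pull the recent metrics lines out of that log might be something like this sketch (the path is the default for Linux package installs mentioned above; adjust it if Filebeat logs elsewhere on your system):

```shell
# Show the last few periodic metrics lines from Filebeat's own log,
# guarding against the file not existing at the default location.
LOG=/var/log/filebeat/filebeat
if [ -f "$LOG" ]; then
  grep -i "metrics" "$LOG" | tail -n 5
else
  echo "Filebeat log not found at $LOG"
fi
```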