Why is Filebeat reading log files over and over again?

Hello all

I know this was posted before, only I never read a satisfying answer/solution.
I was advised by my user success manager to post the problem here.

Using a Windows 10 environment (also tried on Linux).
I am using a simple configuration to read a log file with Filebeat.
To start Logstash I use the command .\bin\logstash -f .\config\sample.conf
Sample.conf:
input {
  beats { port => 5044 }
}
filter {
  grok {
    match => [
      "message", "%{TIMESTAMP_ISO8601:timestamp_string} %{SPACE}%{GREEDYDATA:line}"
    ]
  }
  mutate {
    remove_field => ["message", "timestamp_string"]
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
  }
  stdout {
    codec => rubydebug
  }
}

I start Filebeat with the command .\filebeat
Filebeat.yml:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - ./sample.log

output.logstash:
  hosts: ["localhost:5044"]

Sample.log contains 14 records

What happens is that the log file is being read and sent over and over again, which produces a lot of duplicates. I found a way to avoid the duplicates with the use of a fingerprint (see the sketch below), but that is not what I want.
I want the log file to be picked up by Filebeat only when a change happens in the file, not to be read all over again from the start.
Also tried ignore_older: 5s, but it gave the same results.
In the registry file data.json the offset is constantly reset to 0.
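
For reference, the fingerprint workaround looks roughly like this: hash the message into a document ID so Elasticsearch overwrites duplicates instead of indexing them again. This is a minimal sketch using the standard fingerprint filter; the hash method and source field are my own choice:

filter {
  fingerprint {
    source => ["message"]
    target => "[@metadata][fingerprint]"
    method => "SHA1"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    document_id => "%{[@metadata][fingerprint]}"
  }
}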

Question:
Why are basic functions of Filebeat not working (what am I missing)?

Is it reading the log from a local file or a network drive? Is the file being appended to or copied into place?

The whole setup is on one machine, including the log file.
I tried it both ways: copying the file and appending to a file with echo -n "text" >> /{path}/sample.log
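
On the Windows 10 side, the same append test can be done with PowerShell's standard Add-Content cmdlet (the file path here matches my sample setup):

Add-Content -Path .\sample.log -Value "some new log line"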

Solved.
It's not Filebeat itself causing the problem, but Logstash.

Cause:
A bug in Logstash 7.5.2

Solution:
Replace Logstash 7.5.2 with 7.5.1 or 7.6.1,
or fully upgrade the whole stack to 7.6.1