Why does Logstash reload the same data many times?


I have an issue with Logstash: it loads the same data many times.

This is my config:

  • Logstash is running as a service on Windows.

    input {
        file {
            path => ["D:/exploit/tmp/nuitAppELK/annuel/CALCOMPTEURS_ANN-*.log"]
            type => "xtruncpt-annuel-belgique"
            start_position => "beginning"
            sincedb_path => "/dev/null"
        }
    }

    filter {
    }

    output {
        if [type] == "xtruncpt-annuel-belgique" {
            elasticsearch {
                hosts => [""]
                index => "xtruncpt-annuel-belgique-anvers"
            }
        }
    }
  • The file was updated for the first time around 10pm.

  • As we can see, the same data was loaded three times last night.

Can you please help me fix this issue? I want the data to be loaded only once for every update of the file.


This setting means that Logstash will not keep track of its state across restarts, which can lead to duplicate data.

Logstash also assumes files are being appended to and will essentially tail them. If files are copied over periodically, they will generally show up as new files, even if they have the same name. How is data added to your files?
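For illustration, pointing `sincedb_path` at a real file lets Logstash persist its read position across restarts, so already-read data is not re-ingested. This is a minimal sketch; the sincedb location (`D:/exploit/tmp/sincedb_annuel`) is a hypothetical path, and any writable location works:

```
input {
    file {
        path => ["D:/exploit/tmp/nuitAppELK/annuel/CALCOMPTEURS_ANN-*.log"]
        type => "xtruncpt-annuel-belgique"
        start_position => "beginning"
        # Persist the read offset to a real file instead of discarding it.
        # Note: on Windows "/dev/null" does not exist; if you actually want
        # to disable the sincedb there, the special value "NUL" is used.
        sincedb_path => "D:/exploit/tmp/sincedb_annuel"
    }
}
```

With a persistent sincedb, `start_position => "beginning"` only applies to files Logstash has never seen before; on restart it resumes from the recorded offset instead of rereading from the start.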



Every day at 10pm I delete the files and create new ones.

But I also have other files (the input section lists many files) which are updated every 30 seconds; in that case the new data is appended to the same file, and I delete the file once per week.

If I remove sincedb_path => "/dev/null", the process works with no problem. Can you please explain what this setting means, and whether there is another setting to force Logstash to load the data only once?


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.