Duplicate documents in Elasticsearch when Filebeat log files are appended

I'm using a Filebeat -> Logstash -> Elasticsearch setup, everything on 7.x.
While parsing Apache and JSON logs, I notice that every time a file is appended with new lines, the old lines also get re-created in Elasticsearch. How can I avoid that?
For example, if my JSON file initially looks like this, Filebeat writes 2 records to ES:
sample.json

{"line11": "line11", "key11": "value11", "key12": "value12"}
{"line21": "line21", "key21": "value21", "key22": "value22"}

Later, when a 3rd record is appended, all 3 records are re-written to ES, while I only need the new one:

{"line11": "line11", "key11": "value11", "key12": "value12"}
{"line21": "line21", "key21": "value21", "key22": "value22"}
{"line31": "line31", "key31": "value31", "key32": "value32"}

filebeat.yaml

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/folder1/*
  scan_frequency: 10s
  ignore_older: 4h
  tags: ["jsonlogs"]
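
If re-sends can't be fully prevented, one workaround I'm considering is to give every line a deterministic ID on the Filebeat side, so a re-shipped line overwrites its earlier copy instead of creating a new document. A rough sketch with the Beats fingerprint processor (available from 7.6 as far as I can tell; since Logstash sits in between, its elasticsearch output would also need document_id => "%{[@metadata][_id]}" to actually use it):

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/folder1/*
  scan_frequency: 10s
  ignore_older: 4h
  tags: ["jsonlogs"]
  processors:
    # Hash each raw line into a stable ID: the same line always
    # produces the same ID, so a duplicate send becomes an overwrite.
    - fingerprint:
        fields: ["message"]
        target_field: "@metadata._id"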

logstash.conf

input {
  beats {
    type => "sample"
    port => "5044"
  }
}
filter {
  if "jsonlogs" in [tags] {
    json {
      source => "message"
    }
    mutate {
      remove_field => ["message"]
      add_field => { "hostname" => "my-server" }
    }
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my-index"
  }
}
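
Alternatively, the same idea could live entirely in Logstash, without touching Filebeat: fingerprint each raw line and use the hash as the Elasticsearch document ID. A minimal sketch, assuming the fingerprint filter runs before the mutate above removes message (SHA256 and the [@metadata][fingerprint] field name are just my choices, nothing required):

filter {
  # Must run before the existing mutate drops "message".
  fingerprint {
    source => "message"
    target => "[@metadata][fingerprint]"
    method => "SHA256"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my-index"
    # Same line => same ID => an update instead of a duplicate document.
    document_id => "%{[@metadata][fingerprint]}"
  }
}

As I understand it, neither variant stops Filebeat from re-reading the file; they only make re-sends idempotent, and genuinely identical lines in the source file would also collapse into a single document.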

How can I avoid these duplicates when the log file is appended? Is a fingerprint-based document ID, as sketched above, the right approach, or is there a Filebeat setting I'm missing?
