Elasticsearch: Same document repeating multiple times with Filebeat to Logstash output

Hello,

I configured Filebeat to parse inputs from a log file:

```yaml
filebeat.inputs:
- type: log
  enabled: true
  recursive_glob.enabled: true
  close_inactive: 1h
  close_removed: true
  clean_inactive: 5h
  clean_removed: true
  ignore_older: 4h
  paths:
    - "/asdad/*.json"
  json.keys_under_root: true
  json.add_error_key: true
  fields:
    index: mymetricsindex
    hc_type: mymetrics
```

My JSON file has 10 unique rows. Elasticsearch had been displaying them correctly for a long time. Suddenly, after some updates and a restart, each row is repeated 14 times, giving 140 rows in the ES query.
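Filebeat only guarantees at-least-once delivery, so events can be resent after a restart. A common mitigation is to derive the Elasticsearch document `_id` deterministically from the event content (e.g. with Logstash's fingerprint filter), so a resent event overwrites the existing document instead of creating a duplicate. A minimal sketch of the idea in Python, with illustrative field names:

```python
import hashlib
import json

def fingerprint(event: dict) -> str:
    """Deterministic ID from the event content: the same row always
    hashes to the same ID, however many times it is resent."""
    canonical = json.dumps(event, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Illustrative event; key order does not matter because keys are sorted
# before hashing.
row = {"host": "web-01", "metric": "cpu", "value": 0.42}
resent = {"value": 0.42, "metric": "cpu", "host": "web-01"}
assert fingerprint(row) == fingerprint(resent)
```

Indexing with such an ID turns duplicates into harmless overwrites of the same document rather than 14 separate copies.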

I encountered a problem with Filebeat where the last file imported into ELK is re-imported continuously, endlessly, until Filebeat detects a new file. That new file is then imported continuously until another new file shows up, and so on.

I noticed in the registry (/var/lib/filebeat/registry/filebeat/data.json) that the offset for the file is never updated to the correct value; it remains 0. The Filebeat log shows the offset correctly, but the value never seems to be updated in the registry.
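A stuck offset of 0 would explain the re-imports: on restart, Filebeat resumes each file from the offset stored in the registry, so an offset of 0 means the whole file is read again. A quick way to check the stored offsets is to parse the registry file directly. A sketch, assuming the registry at that path is a JSON array of per-file states with `source` and `offset` keys (the sample entry below is illustrative):

```python
import json

def read_offsets(raw: str) -> dict:
    """Map each tracked file to the offset persisted in the registry."""
    return {entry["source"]: entry["offset"] for entry in json.loads(raw)}

# Illustrative registry content; in practice, read it from
# /var/lib/filebeat/registry/filebeat/data.json.
sample = '[{"source": "/asdad/a.json", "offset": 0}]'
offsets = read_offsets(sample)
# An offset of 0 for a non-empty file means it will be re-read from the
# beginning after every restart.
assert offsets["/asdad/a.json"] == 0
```

Comparing these values against the offsets printed in the Filebeat log would confirm whether the registry is the part that is failing to update.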

I am not sure why the data is being repeated. For each unique row in the JSON file, 14 documents are created with the same content. Any leads on a fix would be appreciated.

Thanks

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.