Filebeat loads the whole CSV again when new entries are added

Hi,

I am trying to send a CSV file from a remote server to Elasticsearch on a local server. For this, I am using Filebeat on the remote server to send the CSV to Logstash on the local server.

Logstash listens for Beats input and sends the data on to Elasticsearch.
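
For context, the Filebeat side is roughly the usual log-input-to-Logstash setup; the path and host below are placeholders, not my real values:

```yaml
# filebeat.yml on the remote server (path and host are placeholders)
filebeat.inputs:
- type: log
  paths:
    - /path/to/my_file.csv

output.logstash:
  hosts: ["local-server:5044"]
```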

The whole setup works, but when new values are added to the CSV on the remote side, I see all the values sent to Elasticsearch again.

For example,

There were 4 rows in the CSV, so I have 4 rows in Elasticsearch.
If the CSV is updated to 6 rows, I end up with 10 rows in Elasticsearch, not 6.

Please advise what I am missing.

Without looking at your logs and configuration it is hard to say, but from what you are describing it seems to me that the new lines are not appended to your CSV; instead, a new file is created and it replaces the previous one. What is adding new lines to your CSV?

I am opening the CSV, updating it, and saving it manually.

File editors do not append to existing files. Rather, they create a new file; sometimes they even create a temp file with the new contents and then rename it to the original file name.

You should append to your file instead, e.g. with `echo "my;new;line" >> my_file.csv`
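
If you want to see the difference, you can compare the file's inode before and after a change (by default Filebeat identifies files by inode and device). A rough check, assuming a Linux host and the same example file name:

```sh
ls -i my_file.csv                   # note the inode number
echo "my;new;line" >> my_file.csv   # append in place
ls -i my_file.csv                   # same inode: Filebeat only picks up the new line

# Saving from an editor that writes a temp file and renames it changes the inode,
# so Filebeat treats the result as a brand-new file and ships it from the start.
```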

Thanks a lot :slight_smile:

Alternatively, if you really want to update your file manually, you can set `file_identity` to `path`: Log input | Filebeat Reference [7.12] | Elastic
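
A minimal sketch of that setting on the log input (the path is a placeholder):

```yaml
filebeat.inputs:
- type: log
  paths:
    - /path/to/my_file.csv
  # identify files by path instead of inode + device, so a save-and-rename
  # from an editor is not treated as a brand-new file
  file_identity.path: ~
```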
