Parsing growing CSV files with Logstash?

I have a use case where I need to parse and index CSV files that are collected periodically. The challenge is that a CSV file can keep growing: it is rewritten with additional rows for some time until the next CSV with a new filename is created.

Is there any way for Logstash to resume from the last row/record when it re-reads the same (but larger) CSV file, or will it have to re-process the whole file and up-version the existing records in Elasticsearch?

Resuming from where it left off is how the file input works by default if you have a sincedb (and you always have a sincedb -- sincedb_path only controls whether the read position is persisted across restarts). The input tracks how far into each file it has read, so when the file grows it picks up at the first unread byte rather than re-reading from the top.
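A minimal pipeline sketch along those lines is below. The paths, column names, and index name are placeholders for illustration, not values from this thread; adjust them to your environment.

```
input {
  file {
    path => "/var/data/exports/*.csv"                  # hypothetical location of the collected CSVs
    mode => "tail"                                     # follow files as they grow
    start_position => "beginning"                      # read a newly discovered file from its first row
    sincedb_path => "/var/lib/logstash/csv.sincedb"    # persist the read position across restarts
  }
}

filter {
  csv {
    separator => ","
    columns => ["timestamp", "host", "value"]          # hypothetical column names
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "csv-data"
  }
}
```

With a persistent sincedb_path, only the rows appended since the last read are processed on each pass; if sincedb_path is left at its default, the position is still tracked while Logstash runs but may not survive a restart in some setups (e.g. containers with ephemeral storage).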

