Continuous monitoring of CSV files with Logstash creates duplicate records

When I set the start_position parameter to "beginning" and run my Logstash config file, it starts reading records as expected. But when I add one new record to the same CSV file while it is being monitored, all of the existing records are read again, resulting in duplicate events. And if I set start_position to "end" in another Logstash config file, it does not read the file at all; it just prints "pipeline started" and nothing else happens.
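For reference, here is a minimal sketch of the kind of pipeline I am running (the path and CSV columns are placeholders, not my actual config):

```
input {
  file {
    path => "/tmp/data/*.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    # Placeholder column names for the sample CSV
    columns => ["id", "name", "value"]
  }
}
output {
  stdout { codec => rubydebug }
}
```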

It sounds like you're not actually updating the original file in place (e.g. with echo "new data" >> file.csv) but rather using a text editor that rewrites the file from scratch. An editor typically saves to a new file and renames it over the old one, so Logstash sees what looks like a brand-new file and reads it from the start. You can't do that if you have start_position => beginning.
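If the file really is appended to in place, only the new line should be emitted. Here is a minimal sketch, assuming a Linux-style setup; the paths and the sample record are placeholders:

```
# Append in place: the inode stays the same, so the file input only
# sees the one new line instead of re-reading the whole file.
echo "4,new,record" >> /tmp/data/file.csv
```

You can also pin sincedb_path so that read positions survive restarts. Note that start_position is only consulted for files Logstash has never seen before, i.e. files with no sincedb entry yet:

```
input {
  file {
    path => "/tmp/data/*.csv"
    start_position => "beginning"
    # Remembers how far each file has been read, so a Logstash restart
    # does not re-emit lines it has already processed.
    sincedb_path => "/var/lib/logstash/sincedb_csv"
  }
}
```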

I tried using start_position => end and appended a line to the existing CSV file with echo, but Logstash did not read the file at all.