Dear All, my case is like this. I am processing a file through Logstash, but for some reason I want to delete the old records in ES and reprocess that file. As far as I found, I need to set start_position => "beginning" and sincedb_path => "NUL" (I am using Windows 7). But this does not seem to be the best way, because if Logstash restarts (and it can restart for various reasons while processing records), it starts again from the beginning.
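For reference, the file input I am describing looks roughly like this (the path is just a placeholder, not my real one):

```
input {
  file {
    path => "C:/logs/myfile.log"    # placeholder path
    start_position => "beginning"
    sincedb_path => "NUL"           # Windows null device: no read position is saved
  }
}
```

Because sincedb_path points at the NUL device, no offset survives a restart, which is what causes the behaviour below.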
For example: I have 20,000 records in my file. I start Logstash with start_position => "beginning" and sincedb_path => "NUL". After processing 10,000 records the connection breaks and Logstash restarts, so it processes those 10,000 records again. As a result I get 30,000 records in ES, where the first 10,000 are indexed twice (10,000 duplicate records).
Any pointer on how to reprocess in this case? If Logstash restarts, I want it to resume from where it left off, not start again from the beginning.