Overwrite index when a new input file is discovered

I have the Logstash configuration below reading a large JSON file dumped by separate Python code. The requirement is to refresh the index with the new data on a daily basis. How do I clear the old data from the index before reading the new data?

input {
	file {
		mode => "read"
		path => ["c:/EDM/tools/dremio_dashboard/sys_jobs_recent.json"]
		file_completed_action => "log_and_delete"
		file_completed_log_path => "c:/EDM/tools/dremio_dashboard/archive"
		codec => json
	}
}

output {
	stdout {
		codec => rubydebug {}
	}
	elasticsearch {
		hosts => ["http://localhost:9200"]
		index => "dremio_sys_jobs_recent"
	}
}

Hello,

You need to use a rollover index: Rollover | Elasticsearch Guide [8.15] | Elastic
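
On the Logstash side that means writing through a rollover alias instead of a fixed index name. A minimal sketch using the elasticsearch output's ILM settings; the alias, pattern, and policy name (dremio_sys_jobs_policy) are placeholders to adapt, and the policy itself is sketched further below:

output {
	elasticsearch {
		hosts => ["http://localhost:9200"]
		# write through an ILM-managed rollover alias instead of index => "..."
		ilm_enabled => true
		ilm_rollover_alias => "dremio_sys_jobs_recent"
		ilm_pattern => "{now/d}-000001"
		ilm_policy => "dremio_sys_jobs_policy"
	}
}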
To manage old indices you need to create/review ILM policies.
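
For a daily refresh, a minimal policy could roll the write index over once a day and delete the previous backing index shortly after rollover; the policy name and the 1d timings below are only examples to adjust to your retention needs (run in Kibana Dev Tools or via curl):

# Roll the write index over daily; delete old backing indices one day after rollover
PUT _ilm/policy/dremio_sys_jobs_policy
{
	"policy": {
		"phases": {
			"hot": {
				"actions": {
					"rollover": { "max_age": "1d" }
				}
			},
			"delete": {
				"min_age": "1d",
				"actions": { "delete": {} }
			}
		}
	}
}

With this in place each daily load goes into a fresh backing index behind the alias, and the old data is removed on the ILM schedule instead of accumulating in a single index.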

Thanks!