After converting a CSV to ES, is there any way to update the index without changing its name?

Scenario

  1. Use Logstash to transfer a CSV file to Elasticsearch.
  2. The CSV is updated every week, so a new index has to be created for it and the index name changes; otherwise the index would keep both the old and the new data, and we only need the latest CSV's data.

Is there any way to clean up the data in the index before Logstash loads new data into ES?
My solution:

  1. Delete the existing index.
  2. Use Logstash to transfer the CSV to ES.

Is there any way to configure this in Logstash?
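Outside of Logstash itself, one option is to drop the index with Elasticsearch's delete index API just before the weekly import, so the fresh CSV lands in an empty index. A minimal sketch, assuming a local node and a hypothetical index named csvdata:

    # "csvdata" is a placeholder index name; point this at your own index.
    curl -XDELETE 'http://localhost:9200/csvdata'

Once the index is gone, re-running the usual Logstash pipeline recreates it with only the new data.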

I'm not aware of any Logstash-based way to do this, though I'd happily be corrected.
Sounds more like a job for Elasticsearch Curator.


Just use an index per week, then Elasticsearch Curator to manage deletion as mentioned by @Kryten.
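If each weekly load goes into its own index (for example a name that embeds the week, such as csvdata-2024.32), the old weeks can then be removed with Curator's singleton CLI. A rough sketch, assuming a csvdata- prefix and a four-week retention; check the filters against your own naming scheme and Curator version:

    curator_cli --host localhost --port 9200 delete_indices \
      --filter_list '[
        {"filtertype": "pattern", "kind": "prefix", "value": "csvdata-"},
        {"filtertype": "age", "source": "creation_date", "direction": "older",
         "unit": "weeks", "unit_count": 4}
      ]'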

Thanks, that makes sense. Any idea how to run it as a cron job?

Thanks

Thanks guys,
I probably need a shell script rather than this, since we want it to run automatically,
because every week I need to 1. remove the index and 2. import the data.

If there is an ES tool that can schedule this as a cron job, please let me know. Thanks in advance.
@Kryten @warkolm

No, you need to use cron.
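For completeness, a minimal sketch of what that cron-driven refresh could look like; the script name, index name, host, and Logstash paths below are all assumptions:

    #!/bin/bash
    # reload_csv.sh (hypothetical name): drop the old index, then re-import the CSV.
    curl -s -XDELETE 'http://localhost:9200/csvdata'
    /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/csv.conf

Then register the script in cron, e.g. to run every Monday at 03:00:

    # crontab -e entry; the schedule and paths are placeholders.
    0 3 * * 1 /path/to/reload_csv.sh >> /tmp/reload_csv.log 2>&1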