Delete old data

I want to delete all data from Elasticsearch that is more than one week old. I have tried the command below:

curl -XDELETE 'http://localhost:9200/logstash_2015_09_12'

But the data for that day is still displayed in Kibana. The index pattern on my settings page in Kibana is logstash-*. Am I missing anything?
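A note on the index name, as a hedged sketch: by default, Logstash stamps its daily indices as logstash-YYYY.MM.dd (dash and dots), which is also what Kibana's logstash-* pattern matches, so the underscore form above would not match any default index. Assuming the default naming and the localhost:9200 endpoint from the post, the delete for that day would look like:

```sh
# Default Logstash daily index naming is logstash-YYYY.MM.dd, so this is
# the index that Kibana's logstash-* pattern would actually pick up:
curl -XDELETE 'http://localhost:9200/logstash-2015.09.12'
```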

Also, please suggest the best way to set up a job that removes all data more than one week old, assuming the job runs once a day.

Have you refreshed your index in Kibana?

Use Curator.

I concur with @dadoonet. Curator is the tool to use for Logstash indices. The usage documentation for Curator can be found at https://www.elastic.co/guide/en/elasticsearch/client/curator/current/index.html (@dadoonet provided the link to the source code).
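As a rough illustration of what that looks like, here is a sketch assuming the Curator 3.x command-line interface (flag names differ in other versions; the host and index prefix are assumptions based on the post):

```sh
# Delete daily logstash-YYYY.MM.dd indices older than 7 days.
curator --host localhost delete indices \
  --prefix logstash- \
  --timestring '%Y.%m.%d' \
  --time-unit days \
  --older-than 7
```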

And one possible reason you still see data for the 12th of September is that Logstash datestamps its indices based on UTC time. Depending on your offset from UTC, you could still see a little or a lot of data for that date in your time zone.
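To make the UTC point concrete, a small sketch (GNU date assumed; the timestamps are illustrative): an event written late in the evening local time lands in the next UTC day's index.

```sh
# 22:00 on 12 Sep in UTC-5 (EST) is 03:00 on 13 Sep in UTC, so Logstash,
# which stamps indices with the UTC date, writes it to the 13th's index:
TZ=UTC date -d '2015-09-12 22:00 EST' +logstash-%Y.%m.%d
# -> logstash-2015.09.13
```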


I totally agree with @dadoonet and @theuntergeek.

Another solution would be writing your own cron job to do that for you. This might take longer to integrate, though.
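A minimal sketch of such a cron job, assuming GNU date, default logstash-YYYY.MM.dd daily indices, and Elasticsearch on localhost:9200 (the schedule and retention window are illustrative):

```sh
# Run daily at 00:30 and delete the index from 8 days ago (UTC),
# keeping roughly one week of data. In crontab, % must be escaped as \%.
30 0 * * * curl -s -XDELETE "http://localhost:9200/logstash-$(date -u -d '8 days ago' +\%Y.\%m.\%d)"
```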

Thanks. This helped.

I have around eight fields in my queries with ELK, viz. bank name, transaction id, timestamp, response code, transaction type, notes, transaction channel, and transaction amount. In Kibana I have used queries like the following in different dashboards:

exists:bankname
exists:transactionid

Besides these, I have different pie charts based on transaction channel (e.g. mobile, internet, ATM), transaction type, and response code (success/failure).
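For context, these are exists-type queries (in Lucene query-string syntax the canonical form is _exists_:bankname); at the Elasticsearch level they correspond roughly to the sketch below. The index pattern and the ES 2.x-style request body are assumptions.

```sh
# Roughly what an exists query on bankname looks like against ES 2.x:
curl -XGET 'http://localhost:9200/logstash-*/_search' -d '
{
  "query": { "exists": { "field": "bankname" } }
}'
```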

Can you please suggest whether I need to tune anything in Elasticsearch so that the dashboards perform well with large amounts of data?

@chinmoyd This question should be its own topic.