Complete newbie, but need urgent cleanup


I have taken over the monitoring / Elasticsearch setup from a colleague who left the company.
I now see that our D drive is almost full and will run out of space shortly.
I'm a complete newbie and need to learn a lot, but this is kind of important.
Can somebody direct me to the correct way to clean up this location?

Please make the instructions complete for the Elasticsearch n00b that I am.
The only thing I currently know is that this location is taking 707 GB of data (found using TreeSize):

Can I somehow clean up this location?

There isn't much you can do if the data on your cluster is genuinely that large. Can you take a backup of your data (Back up a cluster’s data | Elasticsearch Guide [7.14] | Elastic), then delete the data folder and restore it?

Hi @TwanVeugelers Welcome to the community, sorry you are joining with an urgent issue.

Another very quick approach would be to log into Kibana.

First, take a snapshot as @rashmi indicated.

Do NOT manually delete files from the data directory, or you WILL corrupt your cluster.

Then, in Kibana -> Dev Tools, run the following:

GET /_cat/indices/*?v

This will list the indices by name.

GET /_cat/indices/*?v&s=store.size:desc

This will list the indices by size in descending order.

You can DELETE any index using the following command:


DELETE my-index-name

This will clean up space in the proper manner
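If you would rather script this than use the Dev Tools console, the same cat-indices call works over plain HTTP. Here is a minimal Python sketch, assuming the cluster is reachable at http://localhost:9200 with no authentication (adjust the URL and add auth for your setup); it uses the cat API's `h`, `s`, and `bytes` parameters to get just the index name and size, sorted largest first:

```python
# Sketch: list indices and their sizes over the REST API instead of Dev Tools.
# Assumes the cluster runs at http://localhost:9200 with no auth -- adjust as needed.
import urllib.request

CAT_URL = "http://localhost:9200/_cat/indices/*?v&h=index,store.size&s=store.size:desc&bytes=b"

def parse_cat_indices(text):
    """Parse the tabular _cat/indices output into (index, size_in_bytes) pairs."""
    rows = []
    for line in text.strip().splitlines()[1:]:  # first line is the header row
        index, size = line.split()
        rows.append((index, int(size)))
    return rows

def fetch_indices(url=CAT_URL):
    with urllib.request.urlopen(url) as resp:
        return parse_cat_indices(resp.read().decode())

if __name__ == "__main__":
    for index, size in fetch_indices():
        print(f"{index}\t{size / 1024**3:.1f} GB")
```

This only lists; it deletes nothing, so it is safe to run while you get oriented.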


Hi @stephenb @rashmi

Thank you for your replies.
I have this list of indices. Please let me know if I'm reading this correctly.

Let's talk about the big ones. metricbeat 2021.04.13, is this index only historical data for that particular date?
I only need data that is at most 7 days old. So is my assumption correct that I could simply run:
DELETE metricbeat-7.10.2-2021.04.13-000003
in Dev Tools? And basically only lose old historical data that nobody will ever look into again!

Sorry, as said, complete newbie.

Hi @TwanVeugelers

Good you got this far.

2nd: PLEASE do not post screenshots; they are very hard to read and cannot be searched or copy-pasted. Please paste in the text next time and format it with the </> button.

The date in the index name is the start date of that data, so the index holds more than 1 day.

I see these 2 indices, so the 04.13 index has nearly 1 month of data.


You can actually run this to get the actual min and max dates/times:

GET metricbeat-7.10.2-2021.04.13-000003/_search
{
  "size": 0,
  "aggs": {
    "min_date": {
      "min": {
        "field": "@timestamp",
        "format": "yyyy-MM-dd"
      }
    },
    "max_date": {
      "max": {
        "field": "@timestamp",
        "format": "yyyy-MM-dd"
      }
    }
  }
}

And yes, if you DELETE it, that data is permanently deleted and the disk space is freed.
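Since the goal is a 7-day retention, you can also pre-filter candidates by the start date embedded in the index name before verifying with the aggregation above. A hedged sketch (the name pattern, e.g. metricbeat-7.10.2-2021.04.13-000003, is assumed from this thread; remember the index may contain data much newer than its start date):

```python
# Sketch: pull the start date out of a beats-style index name and check its age.
# The index name pattern is taken from this thread; the date is only the rollover
# START date, so always confirm with the min/max aggregation before deleting.
import re
from datetime import date

NAME_RE = re.compile(r"-(\d{4})\.(\d{2})\.(\d{2})-\d+$")

def start_date(index_name):
    """Extract the yyyy.MM.dd start date embedded in the index name."""
    m = NAME_RE.search(index_name)
    if not m:
        raise ValueError(f"no date in index name: {index_name}")
    year, month, day = (int(g) for g in m.groups())
    return date(year, month, day)

def older_than(index_name, days, today):
    """True if the index's start date is more than `days` days before `today`."""
    return (today - start_date(index_name)).days > days
```

Use this only as a first cut: an index that started a month ago may still hold last week's data, which is exactly why the aggregation check matters.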

Hi @stephenb

Thank you very much.
I have now removed some old data, so for now the pressure is off the kettle (Dutch saying :wink:).
One quick question: using Excel and Notepad++ I was able to quickly select some 100+ files I want to delete. How would I go about removing them in 1 go?

Where can I learn more about the language used in DevTools?

many regards,
Twan Veugelers

Assuming you mean indices, not files in the directory.

Use Notepad++ to format all the individual DELETE requests, then cut and paste them into the Kibana Dev Tools console, select and highlight them all, and press the > arrow; it will execute all the commands serially.

I would do about 10 at a time.
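If hand-formatting 100+ lines in Notepad++ gets tedious, the DELETE commands can be generated mechanically. A small sketch that turns a list of index names into paste-ready Dev Tools batches of 10, matching the "about 10 at a time" suggestion (the example names are hypothetical):

```python
# Sketch: turn a list of index names into DELETE commands for the Dev Tools
# console, chunked into batches so you can paste and run about 10 at a time.
def delete_batches(index_names, batch_size=10):
    """Return a list of strings, each holding up to batch_size DELETE commands."""
    batches = []
    for i in range(0, len(index_names), batch_size):
        chunk = index_names[i:i + batch_size]
        batches.append("\n".join(f"DELETE {name}" for name in chunk))
    return batches

if __name__ == "__main__":
    # Hypothetical example names following the pattern seen in this thread.
    names = [f"metricbeat-7.10.2-2021.04.{d:02d}-000001" for d in range(1, 25)]
    for batch in delete_batches(names):
        print(batch)
        print("---")  # paste one batch at a time into Dev Tools
```

Paste one batch, highlight it, press the > arrow, check the responses, then move to the next batch.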

Hi @stephenb.

Yes, indices. That works like a charm, thank you very much for your effort.



This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.