Deleting index to clear up storage

I have an index called "logstash-events" which holds a few million log entries sent from network devices across the network, but the majority of them (90.45%) are useless.

I've filtered those out going forward, but I still have 90 GB of useless data stored, which slows things down. How can I clear this up?

Would I have to delete the index? I'm okay with that; we're still at an early stage and wouldn't lose anything important besides notices.

You shouldn't be using a single index for this sort of data. Use time-based indices and then split them out by type/source.

For what you want to do, you need the delete-by-query plugin: https://www.elastic.co/guide/en/elasticsearch/plugins/current/plugins-delete-by-query.html
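Once the plugin is installed, a request along these lines should remove the matching documents. This is only a rough sketch; it assumes Elasticsearch is listening on localhost:9200 and that the useless entries can be matched on a field such as severity, both of which are placeholders for your setup:

curl -XDELETE 'http://localhost:9200/logstash-events/_query' -d '
{
  "query": {
    "term": { "severity": "notice" }
  }
}'

Note that disk space is usually reclaimed gradually after a large delete, since deleted documents are only purged when segments merge.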


Alright thanks, got it.

I've also changed my config so indices are created dynamically every day, which makes it easier to manage old documents and delete them if necessary:

index => "logstash_events-%{+YYYY.MM.dd}"
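For completeness, here is a sketch of how that line might sit in the elasticsearch output block, and how a whole day's index could then be dropped in a single call. The host and the example date are placeholders, not values from this thread:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # one index per day, e.g. logstash_events-2016.03.14
    index => "logstash_events-%{+YYYY.MM.dd}"
  }
}

curl -XDELETE 'http://localhost:9200/logstash_events-2016.01.01'

Deleting an entire daily index this way frees its disk space immediately, without needing delete-by-query.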