How to manage the Elasticsearch data directory

I am using the ELK stack in a UAT environment, along with Filebeat (installed on various web servers).

While using it, I can see that the /elasticsearch-5.2.2/data/nodes directory is growing drastically, which eventually crashes the server where Elasticsearch is installed. Is there any way to handle such large volumes of data automatically? I am very new to Elasticsearch. What I want is to delete the older indices automatically at regular intervals.

FYI we’ve renamed ELK to the Elastic Stack, otherwise Beats feels left out :wink:

Why does it crash? What is in the log?

You can use Elasticsearch Curator to manage indices.
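As a rough illustration of what a Curator setup looks like (file names, host, and the 7-day retention here are assumptions, not from this thread), Curator takes a client config file and an action file. A minimal sketch for deleting time-based Filebeat indices by the date in their name:

```yaml
# config.yml -- tells Curator how to reach the cluster (host/port are examples)
client:
  hosts:
    - 127.0.0.1
  port: 9200
logging:
  loglevel: INFO

# delete_old_indices.yml -- one action: delete filebeat-* indices older than 7 days
actions:
  1:
    action: delete_indices
    description: >-
      Delete filebeat indices older than 7 days, based on the date
      embedded in the index name (e.g. filebeat-2017.03.20).
    options:
      ignore_empty_list: True
    filters:
    - filtertype: pattern
      kind: prefix
      value: filebeat-
    - filtertype: age
      source: name
      direction: older
      timestring: '%Y.%m.%d'
      unit: days
      unit_count: 7
```

You would then run `curator --config config.yml delete_old_indices.yml`, typically from a daily cron job, so retention is enforced automatically.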

Thanks for your reply. It crashes because of 100% disk utilization, as the data/nodes directory is growing in size quickly; about 50 GB is consumed in a day.

Also, can you please help me use Elasticsearch Curator step by step, as I am very new to this? Will it manage the indices automatically?
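To see what "delete indices older than N days" means in practice, the core of that logic can be sketched in plain Python. This is only an illustration of the name-based age filter (the index naming pattern `filebeat-YYYY.MM.DD` and the 7-day cutoff are assumptions), not how any particular tool is implemented:

```python
from datetime import datetime, timedelta


def indices_older_than(indices, days, today=None,
                       prefix="filebeat-", fmt="%Y.%m.%d"):
    """Return index names whose date suffix is more than `days` days old.

    Indices are assumed to be named like 'filebeat-2017.03.20'; names
    that don't match the prefix or date format are skipped.
    """
    today = today or datetime.utcnow()
    cutoff = today - timedelta(days=days)
    old = []
    for name in indices:
        if not name.startswith(prefix):
            continue
        try:
            stamp = datetime.strptime(name[len(prefix):], fmt)
        except ValueError:
            continue  # not a date-suffixed index, leave it alone
        if stamp < cutoff:
            old.append(name)
    return old


# Example: with "today" fixed at 2017-03-21 and a 7-day retention,
# only the 2017-03-01 index is selected for deletion.
names = ["filebeat-2017.03.01", "filebeat-2017.03.20", ".kibana"]
print(indices_older_than(names, 7, today=datetime(2017, 3, 21)))
```

A real tool would fetch the index list from the cluster and issue delete requests for the selected names; the filtering step shown here is the part that makes the cleanup automatic and safe for non-matching indices.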

The space your data takes up on disk will depend on your mappings, and you can often save a lot of space by optimising these. This is discussed in this blog post as well as in the documentation.
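As one hedged example of such an optimisation on Elasticsearch 5.x: disabling the `_all` field and mapping fields you only filter on as `keyword` (rather than analysed `text`) can reduce index size noticeably. The template name and field names below are illustrative, not taken from this thread:

```
PUT _template/filebeat_space_savings
{
  "template": "filebeat-*",
  "mappings": {
    "log": {
      "_all": { "enabled": false },
      "properties": {
        "source":  { "type": "keyword" },
        "message": { "type": "text", "norms": false }
      }
    }
  }
}
```

Whether each field should be `keyword` or `text` depends on how you search it, so review your own documents before applying something like this.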

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.