How to set a specific disk threshold for a specific node in an Elastic cluster


I have a multi-node cluster.
I want to allocate only 100 shards to a specific node in the Elastic cluster. (The other nodes should be allowed more than 100 shards.)
I have one node with less disk space than the other nodes, so when data is ingested, that node goes above 85% disk usage. I want to prevent that.

How can I apply this kind of configuration to a specific node?
Can I limit the disk utilization of a specific node (not the whole cluster)?

Thank you..!

No, that's not possible. The shard allocation settings are cluster settings, so they apply to the entire cluster; you can't change them for just one node.
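For reference, the disk-based allocation watermarks can only be set cluster-wide, along these lines (a sketch; the values shown are the defaults, adjust to your environment):

```json
PUT _cluster/settings
{
  "persistent": {
    "cluster.routing.allocation.disk.watermark.low": "85%",
    "cluster.routing.allocation.disk.watermark.high": "90%",
    "cluster.routing.allocation.disk.watermark.flood_stage": "95%"
  }
}
```

Note that when the watermarks are given as percentages they are evaluated against each node's own disk, so the thresholds already apply per node, but the setting itself cannot differ between nodes.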


To add to this and answer the second question you raised:

You cannot limit the disk utilization of a specific node directly; it is tied to shard allocation.
What you can do is play around with your ILM policy or policies: the theory is that with smaller indices and shards, the automatic shard balancing will make better use of the total disk space available across the nodes (of the same tier).

Keep in mind that shards come with some overhead costs though.
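As an illustration of the ILM approach, a policy that rolls over on primary shard size keeps shards small and more evenly distributable. This is only a sketch; the policy name and thresholds are made up and should be tuned for your workload:

```json
PUT _ilm/policy/my-logs-policy
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": {
            "max_primary_shard_size": "25gb",
            "max_age": "7d"
          }
        }
      }
    }
  }
}
```

Smaller rollover thresholds mean more (smaller) shards, which balance more granularly across nodes, at the cost of the per-shard overhead mentioned above.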

Tuning your cluster (search vs. storage) is a never-ending game of turning the gears, improving one thing while degrading another :wink:


Thank you so much for the replies.

Hi Sholzhauer,
Shards are balanced by ILM policies right now.

I have one hot node whose storage size is 400GB, and the other nodes have 500GB.
The 400GB node is always at around 90% of its storage, while the 500GB nodes are at 65% disk usage.
I want to keep disk usage on the 400GB node below 85%. (I cannot increase the disk storage on the 400GB node.)

How can I do this?
Thank you for your support.

As already mentioned, this is not possible; the shard allocation watermarks are a cluster setting, not a node setting.

Elasticsearch will always try to balance the shards evenly between the nodes, so if you have nodes with different disk sizes, you will run into this issue where one node reaches one of the watermarks earlier than the others.
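To see how unevenly the disk is being used per node, the cat allocation API shows shard counts alongside disk usage (the column selection here is just one possible choice):

```json
GET _cat/allocation?v&h=node,shards,disk.used,disk.avail,disk.percent
```

This makes it easy to confirm that the smaller node carries roughly the same number of shards as the larger ones but sits at a much higher disk percentage.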


Thank you so much for your quick response.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.