What is the maximum number of indices I can have in one ES cluster?

I understand one ES shard is one Lucene index and that it will consume some amount of CPU/memory. Is there a limit on how many ES indices I can have in one ES cluster?
e.g. if I have thousands of indices, but most of them are not active for reading/writing, do those ES indices consume CPU and memory as well? Is there some fixed amount of CPU/memory required per ES index?
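For context, here is roughly how I am counting open indices and shards today. This is just a sketch using Python and the `requests` library, and it assumes the cluster is reachable at `http://localhost:9200`; the endpoints are the standard `_cat` APIs.

```python
# Sketch: count open indices and primary shards, and show shards per node,
# to get a feel for the per-shard overhead. Assumes a cluster on localhost:9200.
import requests

BASE = "http://localhost:9200"

# List indices with their status and shard counts.
indices = requests.get(f"{BASE}/_cat/indices?format=json&h=index,status,pri,rep").json()
open_indices = [i for i in indices if i["status"] == "open"]
total_primaries = sum(int(i["pri"]) for i in open_indices)
print(f"open indices: {len(open_indices)}, primary shards: {total_primaries}")

# Per-node shard counts, since every open shard holds some heap even when idle.
allocation = requests.get(f"{BASE}/_cat/allocation?format=json&h=node,shards").json()
for node in allocation:
    print(node["node"], node["shards"])
```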

Hi @yuecong.

I think you need to check here:


Thanks
HadoopHelp


All open indices and shards take up some amount of resources. There is also information about indices, shards and mappings stored in the cluster state, and once this gets large it can cause cluster state updates to be slow, which can hamper performance and lead to stability problems. The number of indices and shards therefore matters, and it may not be heap exhaustion that causes problems first. Elasticsearch does not scale well with a very large number of small shards, and it is possible to crash a cluster just by creating indices and shards, even if you do not put any data in them.
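As a rough illustration (my own sketch, not an official check), this shows the cluster-wide index and shard counts, gives a feel for how large the cluster state has become, and closes an index that is no longer read or written. It assumes a cluster at `http://localhost:9200`, and `old-logs-2020` is a hypothetical index name.

```python
# Sketch: inspect index/shard counts and cluster state size, then close an
# inactive index. Endpoints used: _cluster/stats, _cluster/state, _close.
import requests

BASE = "http://localhost:9200"

stats = requests.get(f"{BASE}/_cluster/stats").json()
print("indices:", stats["indices"]["count"])
print("shards :", stats["indices"]["shards"]["total"])

# Cluster state (index metadata, mappings, routing table). A very large payload
# here is a warning sign that cluster state updates may become slow.
state = requests.get(f"{BASE}/_cluster/state/metadata,routing_table")
print("cluster state payload:", len(state.content), "bytes")

# Closing an index releases the heap its shards hold, although its metadata
# still remains in the cluster state. "old-logs-2020" is hypothetical.
requests.post(f"{BASE}/old-logs-2020/_close")
```

Closing rarely-used indices (or consolidating many small indices into fewer, larger ones) is a common way to keep the shard count, and therefore the overhead, under control.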
