Elasticsearch shard size

Hi Elasticsearch community,

I have been tasked with implementing the Elastic Stack at the company I work for, and I am not sure how many shards I should allocate to each node. The solution will be used for log management.

We will start with a total of 3 Elasticsearch nodes, and the log intake will grow over time. My company provides hosting for customers and will offer this log management solution as a managed product. This means the design needs to account for the data intake growing over time as customers join the solution.

What do you think would be the best approach here? How many shards should I allocate?

I will create a new index every day and keep the logs searchable for a month. After that, I think we will move the data to an external location. Each customer will get a unique type identifier.
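For context, a daily-index setup like this is usually driven by an index template, since the shard count is fixed when each day's index is created. A minimal sketch, assuming a hypothetical `logs-*` naming scheme and the legacy `_template` API; the shard and replica numbers here are placeholders, not a sizing recommendation:

```json
PUT _template/logs-daily
{
  "template": "logs-*",
  "settings": {
    "number_of_shards": 3,
    "number_of_replicas": 1
  }
}
```

With one index per day, the shards-per-node count grows linearly with retention and index count, so whatever number is chosen here should be checked against the cluster's total shard budget as customers are added.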

Customers will be medium-sized companies, with up to 15,000 of them as the current highest estimate. I am not sure what this could add up to in terms of data volume, but I would guess we are talking terabytes per day at a late stage.

Any help or insight would be much appreciated.

/Anders

May I suggest you look at the following resource about sizing:

https://www.elastic.co/elasticon/conf/2016/sf/quantitative-cluster-sizing

Thank you, that helped 🙂