What are the system requirements for Elasticsearch (ELK setup)?


I am new to ELK.

I have to provide architectural solution for archival system in our project.

Our current system works as follows:

We have a service/software which generates site data (oil industry data); this data is inserted into an MSSQL DB.

The table fields are as follows:

Time, Field_ID, Field_Name, Field_Sub_ID, value, extra value, type

To view this data we have provided an Excel add-in.

Users can apply basic filters through this add-in (filters like date/time from & to, Field_ID, Sub_ID).

This data was very limited: we could only view or query data for at most 1 month.

Our setup was:

  • On one system we have the actual software running.

  • Another system (Archival) has MSSQL; the Excel tool is on the same system.

  • A backup system for the Archival system.

Our new requirements are:

  1. We want to store this data for 10 years.
  2. Users may query data in a range of up to 1 year.
  3. We have one more system between the source software system & the Archival system, as Archival has moved to another site.

So data will move from the Source system → Exchange system → Archival system.

I am planning to use ELK:

Our software can generate data as JSON files.

Logstash will collect this data & upload it to Elasticsearch.
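To make the ingestion step concrete: Logstash's Elasticsearch output ultimately writes documents via the bulk API, which pairs each document with an action line. A minimal stdlib-only sketch of that format, assuming records shaped like the table above (the index name `site-data` and the exact field names are assumptions, not from the actual system):

```python
import json

def to_bulk_lines(records, index="site-data"):
    """Convert site-data records into Elasticsearch bulk-API lines.

    Each document is preceded by an 'index' action line, which is the
    newline-delimited format the _bulk endpoint expects.
    """
    lines = []
    for rec in records:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(rec))
    return "\n".join(lines) + "\n"

# Hypothetical record using the field names from the table above:
record = {
    "Time": "2021-06-01T00:00:05Z",
    "Field_ID": 42,
    "Field_Name": "wellhead_pressure",
    "Field_Sub_ID": 7,
    "value": 231.4,
    "extra_value": None,
    "type": "measurement",
}
print(to_bulk_lines([record]))
```

In practice Logstash handles this translation itself; the sketch only shows what one JSON record becomes on the wire.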

We will provide a basic dashboard in Kibana.

We want to show data in near real time; a max 5-second delay is fine.

Data generated per day will be 15 GB

Around 120-200 KB will be generated each second.
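The stated rates can be sanity-checked with a quick calculation (a sketch: 15 GB/day and 120-200 KB/s are the figures from the post, everything else is derived from them):

```python
# Sanity-check the stated ingest rates.
SECONDS_PER_DAY = 86_400

kb_per_sec_low, kb_per_sec_high = 120, 200
gb_per_day_low = kb_per_sec_low * SECONDS_PER_DAY / 1024 / 1024    # ~9.9 GB/day
gb_per_day_high = kb_per_sec_high * SECONDS_PER_DAY / 1024 / 1024  # ~16.5 GB/day

gb_per_day = 15                          # figure stated in the post
tb_per_year = gb_per_day * 365 / 1024    # ~5.3 TB/year
tb_ten_years = tb_per_year * 10          # ~53 TB raw over 10 years

print(f"{gb_per_day_low:.1f}-{gb_per_day_high:.1f} GB/day from the per-second rate")
print(f"~{tb_per_year:.1f} TB/year, ~{tb_ten_years:.0f} TB over 10 years "
      "(before replicas and compression)")
```

So the 15 GB/day figure is consistent with the per-second rate, and the 10-year raw total is on the order of 50+ TB before accounting for replicas or index compression.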

My questions:

  1. Am I considering the correct software?

  2. What should the system configuration be to run Elasticsearch (RAM, processor, disk size)?

We have Windows Server 2019 Standard ROK (16 cores).

  3. I understand ELK is open source & free; if we need support we can purchase a support license.

Can experts please share their thoughts on this?

Thanks in advance.

{Sorry for the long description; it's just to help experts understand my question & setup.}



Welcome to our community! :smiley:

It sounds like Elasticsearch would be suitable for this use case. However, it's difficult to provide sizing suggestions as we don't know how much data you have.
I would start with a single node of Elasticsearch with 4GB of heap, and see how that performs.

You can also use Elastic Cloud, which lets you scale dynamically.

Hello Mark,
Thanks for the reply.

  • Data generated per day will be 15 GB
  • We want to store data for 10 years
  • 1 year of data should be available for queries
  • the remaining data we can load as needed
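One common way to implement the "one year queryable, nine years archived" pattern above is Elasticsearch index lifecycle management (ILM) with time-based indices. A sketch of such a policy built as a Python dict (the phase and action names follow the ILM API, but every threshold here is an assumption to adjust, and `nightly-snapshots` is a hypothetical snapshot-lifecycle policy name):

```python
import json

# Sketch of an ILM policy: keep indices queryable for ~1 year, then
# delete them from the cluster only after a snapshot exists, so the
# older 9 years can be restored from snapshots on demand.
ilm_policy = {
    "policy": {
        "phases": {
            "hot": {
                "actions": {
                    # Roll over to a new index daily or at a size cap.
                    "rollover": {"max_primary_shard_size": "50gb", "max_age": "1d"}
                }
            },
            "cold": {
                # After a month, deprioritize recovery of older indices.
                "min_age": "30d",
                "actions": {"set_priority": {"priority": 0}},
            },
            "delete": {
                "min_age": "365d",
                "actions": {
                    # Ensure a snapshot exists before deletion (policy
                    # name is hypothetical).
                    "wait_for_snapshot": {"policy": "nightly-snapshots"},
                    "delete": {},
                },
            },
        }
    }
}
print(json.dumps(ilm_policy, indent=2))
```

The policy body would be sent to the `_ilm/policy` endpoint; daily rollover with ~15 GB/day keeps individual indices at a manageable size.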

We cannot use the cloud as we are running on a private network, so we will install the ELK stack on our own systems.

  • Is there any documentation which describes the recommended configuration for a private (on-premises) setup?

Thanks & regards

If you're looking at a total of <200GB, then you can easily do that on a single node. You should really consider 3 nodes for redundancy, though; they could start with 8GB of heap each.

Hello Mark
Thanks for your reply.

Total data per year will be 5-6 TB (15 GB per day → ~450 GB per month → ~5,400 GB per year).

We want to store this data for 10 years.
We will query at most 1 year at a time.
So the remaining 9 years of data can be loaded only when we need to query it.

What is the recommended setup & architecture in this case?

Thanks & regards,

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.