Hello,
I am new to ELK.
I have to provide an architectural solution for the archival system in our project.
Our current system is as follows:
We have a service/software which generates site data (oil industry data); this data is inserted into an MSSQL DB.
The table fields are:
Time, Field_ID, Field_Name, Field_Sub_ID, Value, Extra_Value, Type
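For illustration, I imagine one table row would map to a JSON document roughly like this (the values and exact field names here are just my guesses based on the columns above):

```json
{
  "Time": "2024-01-15T10:30:05Z",
  "Field_ID": 1042,
  "Field_Name": "wellhead_pressure",
  "Field_Sub_ID": 3,
  "Value": 812.4,
  "Extra_Value": null,
  "Type": "analog"
}
```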
To view this data we have provided an Excel add-in.
Users can apply basic filters in this add-in (date/time from & to, Field_ID, Sub_ID).
The retained data is very limited: we can only view or query at most 1 month of data.
Our setup was:
- One system runs the actual software.
- Another system (Archival) hosts MSSQL; the Excel tool is on the same system.
- A backup system for the Archival system.
Our new requirements are:
- We want to store this data for 10 years.
- Users may query data in a range of up to 1 year.
- We have one more system between the source software system and the Archival system, as the Archival system has been moved to another site.
So data will move from the Source system to the Exchange system to the Archival system.
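To give an idea of what a typical 1-year user query would become in Elasticsearch, I imagine something like this (the index name and field names are placeholders, not confirmed):

```json
GET sitedata-*/_search
{
  "query": {
    "bool": {
      "filter": [
        { "term":  { "Field_ID": 1042 } },
        { "term":  { "Field_Sub_ID": 3 } },
        { "range": { "Time": { "gte": "2023-01-01", "lte": "2023-12-31" } } }
      ]
    }
  }
}
```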
I am planning to use ELK:
Our software can generate data as JSON files.
Logstash will collect this data and upload it to Elasticsearch.
We will provide basic dashboards in Kibana.
We want to show data in near real time; a delay of up to 5 seconds is fine.
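My planned Logstash pipeline would look roughly like this sketch (the file paths, index name, and date format are placeholder assumptions, not from our real setup):

```conf
input {
  file {
    path => "C:/sitedata/*.json"          # hypothetical folder where the software drops JSON files
    codec => "json"                       # one JSON event per line
    sincedb_path => "C:/logstash/sincedb" # so files are not re-read after a restart
  }
}

filter {
  date {
    match => ["Time", "ISO8601"]          # assumes Time is ISO8601; adjust to the real format
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]    # placeholder host
    index => "sitedata-%{+YYYY.MM}"       # monthly indices, to make 10-year retention manageable
  }
}
```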
Data generated per day will be about 15 GB (around 120-200 KB per second).
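To sanity-check the volumes, here is the back-of-envelope storage math I used (the replica count and overhead factor are assumptions on my side, not measured):

```python
# Rough 10-year storage estimate for the archival cluster.
# Assumptions (not measured): 15 GB/day raw, 1 replica copy,
# and ~10% Elasticsearch indexing overhead.

GB_PER_DAY = 15
DAYS = 365 * 10       # 10-year retention
REPLICAS = 1          # one replica per primary shard
OVERHEAD = 1.1        # assumed index overhead factor

raw_tb = GB_PER_DAY * DAYS / 1000
total_tb = raw_tb * (1 + REPLICAS) * OVERHEAD

# Cross-check against the per-second rate: 200 KB/s sustained all day
PEAK_KB_PER_SEC = 200
peak_gb_day = PEAK_KB_PER_SEC * 86400 / 1e6  # consistent with ~15 GB/day

print(f"raw data over 10 years: {raw_tb:.2f} TB")
print(f"with replica and overhead: {total_tb:.1f} TB")
```

This is only a raw estimate; actual disk usage will depend on mappings and compression.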
My queries:
1) Am I considering the correct software?
2) What should the system configuration be to run Elasticsearch (RAM, processor, disk size)? We have Windows Server 2019 Standard ROK (16 cores).
3) I understand ELK is open source and free, and that if we need support we can purchase a support license.
Can experts please share your thoughts on this?
Thanks in advance.
(Sorry for the long description; it is just to help experts understand my question and setup.)
Regards,
Ash