Switching to Big Data Architecture

Hi guys. I would like to ask for some advice.

Currently, this is what our system architecture looks like (a rough sketch of the pipeline follows the list).

Current Process:

  1. Store raw data in Elasticsearch
  2. Transform the data using Python
  3. Save the transformed data in Postgres
  4. Visualize the data using Tableau

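For reference, steps 1-3 boil down to something like the snippet below (a simplified sketch, assuming elasticsearch-py, pandas, and SQLAlchemy; the index, column, and connection names are just placeholders, not our real configuration):

```python
# Minimal sketch of steps 1-3; index name, column names, and the
# connection string are placeholders.
from elasticsearch import Elasticsearch, helpers
import pandas as pd
from sqlalchemy import create_engine

es = Elasticsearch("http://localhost:9200")                            # one of the commodity ES nodes
pg = create_engine("postgresql://user:pass@localhost:5432/analytics")  # the Postgres server

# 1. Pull a day's documents out of Elasticsearch (scroll through all hits).
docs = helpers.scan(es, index="events-*", query={"query": {"match_all": {}}})

# 2. Transform in Python, e.g. flatten and aggregate with pandas.
df = pd.DataFrame([hit["_source"] for hit in docs])
daily = df.groupby("category", as_index=False).agg(total=("amount", "sum"))

# 3. Save the result to Postgres, where Tableau picks it up.
daily.to_sql("daily_totals", pg, if_exists="append", index=False)
```
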
Other Info:
-The Elasticsearch servers and the Postgres server are just commodity hardware (laptops with an i7 CPU, 16GB RAM, and a 1TB HDD).
-We have a lot of commodity hardware (i7, 16GB+ RAM, 1TB HDD).
-We are planning to buy one server (32GB RAM and 4TB of storage, upgradeable to 786GB RAM and 95TB of storage).
-In the long run, I don't think we can afford to buy another server.

We are currently running into problems, especially as the data volume keeps increasing (approximately 16 million rows per day).
We are thinking of implementing a Big Data architecture (Hadoop) in our system.
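To put the growth in numbers, here is a rough back-of-the-envelope estimate (the 500-byte average row size is purely an assumption for illustration; the real figure depends on our documents):

```python
# Back-of-the-envelope growth estimate; the average row size is an assumption.
ROWS_PER_DAY = 16_000_000
AVG_ROW_BYTES = 500            # assumed average size per row; adjust to the actual data

rows_per_year = ROWS_PER_DAY * 365              # ~5.8 billion rows per year
bytes_per_year = rows_per_year * AVG_ROW_BYTES  # ~2.9 TB per year of raw data

print(f"{rows_per_year:,} rows/year ≈ {bytes_per_year / 1e12:.1f} TB/year (before indexes and replicas)")
```

At that rate, even the planned 4TB server would fill up within a couple of years, before accounting for Elasticsearch replicas and Postgres indexes.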

Given these details, what's the best thing we can do to maintain this kind of system?
