I want to learn the EFK stack for a college mini project, nothing very complex.
It depends almost entirely on what exactly you want to do. If you want to index a dump of all of Wikipedia into Elasticsearch, that will require considerably more powerful hardware than if you're indexing a few dozen blog posts.
You'll want to set the heap size to no more than half of your machine's total RAM, but exactly how much heap you need depends on:
- How much data you need to index, and
- How fast you need it to be.
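If you do want to pin the heap explicitly rather than rely on the defaults, it's set in Elasticsearch's `jvm.options` file (or via the `ES_JAVA_OPTS` environment variable). A minimal sketch, assuming a laptop with 2GB of total RAM:

```
# config/jvm.options — give Elasticsearch a 1g heap (half of 2GB total RAM).
# Xms and Xmx should be set to the same value to avoid resize pauses.
-Xms1g
-Xmx1g
```

The same thing as a one-off, without editing the file: `ES_JAVA_OPTS="-Xms1g -Xmx1g" ./bin/elasticsearch`.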
Elasticsearch will run and be able to index several GB of data on a machine with 1GB of RAM, so any modern laptop should be able to run it at the sort of scale most college projects require.
Say a simple server log file with only hundreds of records: apply visualization in Kibana, and similar simple tasks. Nothing big or complex. I want to learn EFK for college exams, not for industry production. So how much RAM and how many cores should my processor have?
Elasticsearch should be able to handle a load as small as hundreds of documents on basically any hardware it can run on at all. Given that you seem to be talking about running Elasticsearch, Kibana, and a browser on the same system, I would be most worried about the resource usage of the browser in this case.
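To give a sense of scale: a few hundred log records fit comfortably in a single request to Elasticsearch's `_bulk` API. Here's a minimal Python sketch that turns common-format server log lines into a bulk payload; the log format and the index name `server-logs` are assumptions for illustration, and sending the payload is just an HTTP POST of this body to `localhost:9200/_bulk`:

```python
import json
import re

# Parse Apache/nginx "common log format" lines into dicts.
# Example: 127.0.0.1 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d+) (?P<bytes>\d+)'
)

def to_bulk_payload(log_lines, index="server-logs"):
    """Build the newline-delimited JSON body for Elasticsearch's _bulk API."""
    actions = []
    for line in log_lines:
        match = LOG_PATTERN.match(line)
        if not match:
            continue  # skip malformed lines
        doc = match.groupdict()
        doc["status"] = int(doc["status"])
        doc["bytes"] = int(doc["bytes"])
        # Each document is preceded by an "index" action line.
        actions.append(json.dumps({"index": {"_index": index}}))
        actions.append(json.dumps(doc))
    # The bulk body must end with a trailing newline.
    return "\n".join(actions) + "\n"

if __name__ == "__main__":
    sample = [
        '127.0.0.1 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326',
        '10.0.0.5 - - [10/Oct/2023:13:56:01 +0000] "POST /login HTTP/1.1" 302 512',
    ]
    print(to_bulk_payload(sample))
```

At this scale the payload is a few tens of kilobytes at most, which is why the hardware question matters far less than it would for a production workload.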
I did a quick test using one of the sample data sets included in Kibana (a simulated web server log with accompanying dashboards), on a virtual machine running Elasticsearch, Kibana, and Firefox on Ubuntu 16.04. With 2GB of RAM and 1 core, it worked fine but was quite slow. After increasing the allocation to 4GB of RAM and 2 cores, it ran significantly faster and was much more pleasant to use. Processors of course vary in speed beyond just core count, but hopefully that gives you an idea of the sort of hardware you'll need for some basic experiments.