I am currently working with Elasticsearch, Logstash, and Kibana (ELK) alongside Filebeat for log ingestion. I am seeking advice on optimizing CPU and memory utilization in my setup.
1. What are some common strategies for optimizing CPU and memory usage in an ELK setup with Filebeat?
2. Are there specific configurations that can help minimize resource utilization during heavy indexing periods?
The new 8.12 release improves the default settings for Filebeat to optimize CPU and memory usage across Filebeat and Elasticsearch. For this reason, you may not need to change anything until you start running into problems, and the changes required will then depend on the problem encountered.
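If you do hit resource pressure before upgrading, the usual knobs in `filebeat.yml` are the memory queue and the Elasticsearch output batching. A hedged sketch below — the values are purely illustrative, not recommendations, and the right numbers depend on your event sizes and load testing:

```yaml
# filebeat.yml — illustrative tuning knobs, not recommended values
queue.mem:
  events: 4096            # total events buffered in memory; larger = more RAM, fewer flushes
  flush.min_events: 2048  # batch size the queue hands to the output
  flush.timeout: 1s       # max wait before flushing a partial batch

output.elasticsearch:
  hosts: ["https://es01:9200"]
  worker: 2               # parallel bulk workers; more workers = more CPU and connections
  bulk_max_size: 1600     # events per _bulk request; bigger batches cut per-request overhead
```

Roughly, larger queues and batches trade memory for indexing throughput, while fewer, bigger bulk requests reduce CPU overhead on both Filebeat and Elasticsearch.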
Is your architecture Filebeat -> Logstash -> Elasticsearch? Do you need to use Logstash? Can you just have Filebeat talk directly to Elasticsearch?
Are you currently experiencing issues with throughput? Have you considered doing any load/scale testing to see what might need to be optimized in your setup?
This is great news! Do you happen to have links to any documentation with details on the 8.12 optimizations? We're currently running 8.9 and going through an initiative to improve our Elastic performance. I would love more details to use as an argument for taking the time to upgrade to the latest version of Elasticsearch and Beats.
This blog post covers the changes: Using Elastic Agent Performance Presets in 8.12 | Elastic Blog
It mentions Elastic Agent in the title but everything in the post is also applicable to Beats. These changes are only applicable for Beats -> Elasticsearch though.
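For reference, if I'm reading the blog post correctly, on 8.12+ a preset is selected with a single option on the Elasticsearch output, something like:

```yaml
# filebeat.yml — select a performance preset (8.12+), assuming the `preset`
# option described in the blog post
output.elasticsearch:
  hosts: ["https://es01:9200"]
  preset: throughput   # e.g. balanced (default), throughput, scale, latency, custom
```

Each preset bundles a set of queue and bulk-output settings tuned for that goal, so you'd pick the one matching your priority rather than hand-tuning individual options.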