Elastic Agent vs Logstash with Filebeat

Hello,

I have a server running several Docker containers (including, for example, Keycloak). I want to send logs from these Docker containers to Elasticsearch.
Currently, for one Docker container I'm using a separate Docker container with Filebeat, which sends logs from the Keycloak container to Elasticsearch.
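For context, the Filebeat setup looks roughly like this — a minimal `filebeat.yml` sketch using Filebeat's `container` input, where the paths and Elasticsearch host are placeholders, not the actual values from my setup:

```yaml
# Minimal sketch of a Filebeat config shipping Docker container logs.
# Paths and hosts are placeholders.
filebeat.inputs:
  - type: container
    paths:
      - /var/lib/docker/containers/*/*.log

output.elasticsearch:
  hosts: ["https://elasticsearch.example:9200"]
```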

If I also want to send logs from the other Docker containers (I have about 5 of them in total), it occurred to me that it might be better to use Elastic Agent.

A colleague wants me to compare Elastic Agent vs Logstash running on the Elasticsearch server.
The goal is to avoid unnecessarily burdening the server where the application Docker containers are running, and instead have the log collection handled by something running on the Elasticsearch server.

Which is better?

  • Using Elastic Agent on the server where the Docker containers are running, or

  • Running Logstash on the Elasticsearch server (where Elasticsearch itself runs as a Docker container) and having Filebeat send raw logs from the Docker containers to Logstash?

In terms of: experience, setup difficulty, management, administration, ease of making changes, etc.

Thank you.


Hello and welcome,

To get the logs from your Docker containers, you need something running on the Docker server; Logstash would not be able to read the logs remotely in this case. Also, running anything besides Elasticsearch on the same server is not recommended.

I think a better comparison would be between running Filebeat or Elastic Agent, since either one would need to run on the Docker server hosting your containers.

Elastic Agent runs a Filebeat process under the hood, so in terms of system requirements they are similar. Elastic Agent may require more resources because it does more, but this would need to be tested.

The main difference is in management: Elastic Agent can be centrally managed using Fleet in Kibana. You would need a Fleet Server, an Agent Policy, and a few other things configured first, but this lets you manage the configuration directly from Kibana.
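To give an idea of what a Fleet-managed setup involves, here is a minimal docker-compose sketch for running Elastic Agent as a container enrolled in Fleet. The Fleet URL, enrollment token, and image version are placeholders you would replace with your own:

```yaml
# Sketch of a Fleet-managed Elastic Agent container.
# FLEET_URL and FLEET_ENROLLMENT_TOKEN are placeholders.
services:
  elastic-agent:
    image: docker.elastic.co/elastic-agent/elastic-agent:8.13.0
    environment:
      - FLEET_ENROLL=1
      - FLEET_URL=https://fleet-server.example:8220
      - FLEET_ENROLLMENT_TOKEN=<your-enrollment-token>
```

Once enrolled, the agent pulls its input configuration from the Agent Policy in Kibana instead of a local file.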

How are you parsing your logs? I do not see any use for Logstash here unless you want to parse the messages there.


The logs are parsed by Filebeat running on the machine with the Docker containers. My goal is to put as little load as possible on that machine. However, Logstash is not a lightweight application in terms of resource usage — it consumes quite a lot of RAM and CPU — and I only have 5 Docker containers to collect logs from.
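For what it's worth, if the containers emit JSON log lines, parsing can stay on the Filebeat side with a processor, avoiding Logstash entirely. A minimal sketch, assuming the JSON lives in the `message` field (an assumption, not from the original setup):

```yaml
# Sketch of Filebeat-side parsing using the decode_json_fields processor.
# Assumes each log line is a JSON document in the "message" field.
processors:
  - decode_json_fields:
      fields: ["message"]
      target: ""
      overwrite_keys: true
```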