Using Datadog's Vector to ship logs to Elasticsearch instead of elastic-agent?

Datadog's Vector program has a feature which allows you to ship logs directly to Elasticsearch.

It looks like this could be used as an alternative to elastic-agent.

Does anyone have any experiences to share on using this feature?
How reliable is this?
Is it a viable alternative to elastic-agent for shipping logs to Elastic?

It really depends on your use case and what you want to do with your logs.

Vector is an ETL tool and more of an alternative to Logstash; it has many sources, transforms, and sinks that you can use to collect logs and send them to Elasticsearch.
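To illustrate, shipping logs to Elasticsearch with Vector is just a matter of wiring a source, an optional transform, and the `elasticsearch` sink together. A minimal sketch (the file paths, endpoint, and index name below are placeholders, and it assumes the incoming `message` field is JSON):

```toml
# Hypothetical minimal Vector config: tail files, parse JSON, ship to Elasticsearch.
[sources.app_logs]
type = "file"
include = ["/var/log/app/*.log"]   # placeholder path

[transforms.parse]
type = "remap"
inputs = ["app_logs"]
source = '''
. = parse_json!(.message)   # aborts the event if the message is not valid JSON
'''

[sinks.es]
type = "elasticsearch"
inputs = ["parse"]
endpoints = ["https://localhost:9200"]   # placeholder endpoint

[sinks.es.bulk]
index = "app-logs"   # placeholder index name
```

Note that the `remap` transform above is doing by hand what an Elastic Agent integration would otherwise do for you with an ingest pipeline.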

It is not an alternative to Elastic Agent, because Elastic Agent is not just a log collector. Elastic Agent has many integrations that use ingest pipelines to parse your data and make it easier to get useful information from your logs. With an integration you do not have to worry about building pipelines to parse your messages, and you can use the built-in dashboards and alert rules.

There is no easier alternative to Elastic Agent; the alternative is to create your own parsers, your own dashboards, and your own alert rules, and sometimes this can be a lot of work.

Basically, with Elastic Agent you just add the integration you want to collect your logs. Without Elastic Agent you need to manage everything yourself: the pipeline to parse your data, the index template, the index lifecycle policy, the dashboards, the alert rules, etc.
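For example, just the first item on that list, the parsing pipeline, means creating and maintaining an ingest pipeline through the Elasticsearch API. A minimal sketch (the pipeline name and grok pattern here are only placeholders):

```
PUT _ingest/pipeline/my-app-logs
{
  "description": "Hypothetical pipeline parsing an Apache-style message field",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{COMBINEDAPACHELOG}"]
      }
    }
  ]
}
```

An integration ships this kind of pipeline (plus the template, ILM policy, and dashboards) for you; without it, each log format needs its own hand-written version.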

Thanks @leandrojmp , as usual your response is very accurate and useful.

In some cases, I have systems that are producing massive quantities of logs.
In those cases I have seen Elastic-Agent 8.4.2 become a zombie process, and sometimes it uses so much CPU that it starves other components on the system.

I have upgraded my system to Elastic 8.9.0 and my agents to Elastic-Agent 8.9.0, so hopefully
the problems I saw with 8.4.2 are gone.

But at the same time I am looking at alternatives to elastic-agent for shipping logs to Elasticsearch.

I may be able to live without some of the features like custom pipelines and integrations for this particular use case.

If you just want to send the data and build your own parsers, then there are plenty of alternatives; which one is better depends entirely on your use case.

If you want to use the integrations and built-in dashboards, then there is no alternative: you need to use the Elastic Agent.

In these cases it is best to use a load balancer to distribute the log volume across multiple instances.

The same behaviour can happen with Logstash or Vector, for example.

As another example, I have a high event rate firewall device where I need to balance the events between two Logstash instances, and I also need to use Kafka to keep up with the event rate.
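That setup can be sketched with Logstash's Kafka input: the firewall events land in a Kafka topic, and both Logstash instances consume from it with the same consumer group, so Kafka balances partitions between them. A minimal sketch (the broker, topic, and hosts below are placeholders):

```
# Hypothetical pipeline on each of the two Logstash instances.
input {
  kafka {
    bootstrap_servers => "kafka1:9092"        # placeholder broker
    topics            => ["firewall-logs"]    # placeholder topic
    group_id          => "logstash-firewall"  # same group on both instances
  }
}

output {
  elasticsearch {
    hosts => ["https://es1:9200"]             # placeholder cluster
  }
}
```

Kafka also buffers bursts, so a temporary spike from the firewall does not overwhelm Logstash or Elasticsearch.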
