Can Elastic Agent replace Logstash?

Hello,
I've searched the forum and the elastic site, but haven't quite found what I'm looking for. I'm trying to get a better understanding of elastic agents to setup my test environment appropriately.

I want to use multiple integrations with a single Elastic Agent to ingest/parse multiple log types from a single syslog server, e.g. one server receiving nginx, Apache, Cisco, Linux, etc.

The goal is to use a single agent instead of logstash on a centralized server:

  1. Is this possible for an agent?
  2. Is logstash being phased out? It isn't mentioned as much on the Elastic site

Any clarification or guidance you can provide is appreciated.

In theory, yes, you can have an agent with multiple integrations, but whether it can fully replace Logstash depends on what your pipeline looks like, what kind of data you are collecting, how you are collecting it, whether you are doing any enrichment, etc.

Keep in mind that the parsing for integrations is done in Elasticsearch using ingest pipelines. For example, the parsing for a Cisco integration is done in Elasticsearch, not in the Elastic Agent.

The Elastic Agent basically collects the logs and sends them to Elasticsearch, where they are parsed by pre-configured ingest pipelines.
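If you want to see what an ingest pipeline does to a raw event before wiring anything up, Elasticsearch has a simulate API you can call from the Kibana Dev Tools console. A minimal sketch (the grok pattern and sample line are made up, just standing in for what an integration's pipeline does):

```console
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": ["%{SYSLOGTIMESTAMP:timestamp} %{HOSTNAME:host.hostname} %{GREEDYDATA:event.original}"]
        }
      }
    ]
  },
  "docs": [
    { "_source": { "message": "Oct  5 14:32:01 fw01 sshd[4123]: Failed password" } }
  ]
}
```

The response shows the document as it would land in the index, which makes it easy to check the parsing without sending any real traffic through an agent.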

You can also customize some parts of those pipelines (but not everything).
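The usual customization hook is the `@custom` ingest pipeline: an integration's managed pipeline calls an optional pipeline named after the data stream, and you own its contents. A sketch, assuming the Cisco ASA integration (the processors and field values here are illustrative, not part of the integration):

```console
PUT _ingest/pipeline/logs-cisco_asa.log@custom
{
  "description": "Site-local tweaks applied after the integration's own parsing",
  "processors": [
    { "set": { "field": "organization.name", "value": "acme" } },
    { "lowercase": { "field": "observer.hostname", "ignore_missing": true } }
  ]
}
```

Check the docs for the specific integration you are using to confirm the exact `@custom` pipeline name it invokes; the managed pipeline itself should be left alone, since it is overwritten on upgrade.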

There has been no word from Elastic about Logstash being deprecated. For years Logstash has been seen as a tool for more advanced use cases; the recommendation from Elastic, I think, has always leaned toward using Beats to collect logs, and now Elastic Agent.

But to me, as a long-time Logstash user, it seems to be in maintenance mode only, as it has not evolved at the same pace as the other tools in the stack.

So, if you can do the job with Elastic Agent and ingest pipelines, then you can skip Logstash.

We are eliminating Logstash. Flows that used to go from Beats to Logstash now go from the agent directly to Elasticsearch, and syslog inputs that used to hit Logstash now go to agents.

@leandrojmp Thanks for that clarification. That helped me understand both the agent and Logstash better. I will be collecting typical log data from multiple source types on a centralized server. TBD on enrichment; I'm still learning that concept and the benefit of it.

@rugenl Good to know, thanks. Do you know the rough timeline for when Logstash will be removed, and whether any type of support will be available afterwards (e.g. maintenance)?

Logstash is not being phased out by Elastic, @rugenl was talking about his personal/company usage of Logstash.

@leandrojmp I misunderstood, thanks
@rugenl I'm curious about your lessons learned from switching to agents, especially with syslog. Are the integrations able to capture all the log fields? Are multiple log types being collected by a single agent? Etc.

So far the syslog streams we are processing are pretty unique.

We have multiple types, essentially for different tenants of the stack. We just send them to different ports, then configure a corresponding integration for each port. For example, the same agent may process Cisco ASA on one port and Palo Alto on another. Those are supplied integrations; "they just work" :slight_smile:

We have some syslog that doesn't match up with anything other than the Custom TCP/UDP Logs integration. We use its Syslog Parsing option. It captures the entire event, but in some cases the "subject matter experts" can't provide a field layout for the log, so it ends up as just a blob in the message field. I can't see why this wouldn't work for any syslog event.
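That header/blob split can be sketched in plain Python. This is not the integration's actual implementation, just an illustration of what syslog parsing gives you: the RFC 3164 header (priority, timestamp, host, program) becomes structured fields, and whatever follows stays unparsed in `message` unless someone supplies a field layout for it.

```python
import re

# Hypothetical regex for a BSD-syslog (RFC 3164) header; real parsers
# handle more variants (RFC 5424, missing hostnames, etc.).
SYSLOG_RE = re.compile(
    r"<(?P<pri>\d{1,3})>"                            # priority = facility*8 + severity
    r"(?P<timestamp>\w{3} [ \d]\d \d\d:\d\d:\d\d) "  # e.g. "Oct  5 14:32:01"
    r"(?P<host>\S+) "                                # originating host
    r"(?P<program>[^:\[]+)(?:\[(?P<pid>\d+)\])?: "   # program, optional [pid]
    r"(?P<message>.*)"                               # everything else: the blob
)

def parse_syslog(line: str) -> dict:
    """Split a syslog line into header fields plus the raw message blob."""
    m = SYSLOG_RE.match(line)
    if m is None:
        # No recognizable header: keep the whole line as the message blob.
        return {"message": line}
    fields = m.groupdict()
    pri = int(fields.pop("pri"))
    fields["facility"], fields["severity"] = divmod(pri, 8)
    return fields

event = parse_syslog("<34>Oct  5 14:32:01 fw01 sshd[4123]: Failed password for root")
print(event["host"])      # fw01
print(event["severity"])  # 2  (pri 34 = facility 4, severity 2)
print(event["message"])   # Failed password for root
```

The point is that the header is always recoverable, so any syslog event can at least be ingested; turning the trailing blob into real fields is where the "subject matter experts" come in.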

Sorry for that confusion. It's hard to be both concise and complete in posts :slight_smile:

Our original stack is pretty old, from before ECS and before many of the integrations were available. It makes sense to move to what Elastic provides instead of maintaining our old non-ECS Logstash pipelines.

It's just tactical: Fleet was already going to be used to manage a lot of agents. If we can manage everything with agents, we don't have to maintain Logstash.

@rugenl Lots of great insight, thanks! How are security concerns addressed, e.g. a potential SolarWinds-type event?

You don't need to use multiple agents if you don't want to; you can have a single agent receiving and collecting logs from multiple different sources.

Elastic Agent can be used in basically two main ways. One is as a host log collector, getting logs and metrics from a specific host; in this case you need one agent per host, the same way you would use similar tools like Beats itself, Datadog, New Relic, or Wazuh.

The other is to receive logs from multiple different sources, the same way you would use Logstash or a syslog server, for example.

@leandrojmp I understand, thanks!