I have about 2000 Elastic agents (version 8.9.0) connected to a system with 3 Fleet servers (version 8.9.0).
We have about 20 different agent policies, because the various Elastic Agents are sending
slightly different logs, and in certain cases we need to specify dedicated ingest pipelines to process them.
Is it possible to configure the syslog input plugin
using the elastic agent policy?
Logstash input plugins can't be configured in an Elastic Agent policy.
You need to configure them the usual way: by editing Logstash's pipeline files (.conf), or, if you have X-Pack, by using Centralized Pipeline Management from Kibana.
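For reference, a minimal Logstash pipeline file using the syslog input would look something like this (the port, hosts, and index name are placeholders you would adapt to your environment):

```conf
input {
  # Listen for syslog messages over TCP and UDP on port 5514
  syslog {
    port => 5514
  }
}

output {
  # Forward parsed events to Elasticsearch (placeholder host)
  elasticsearch {
    hosts => ["https://localhost:9200"]
    index => "syslog-%{+YYYY.MM.dd}"
  }
}
```

This is the kind of file you would edit directly, or manage through Centralized Pipeline Management if you have that available.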
What advantage does using this processor in the elastic-agent/beats offer?
I am investigating if it is possible to send syslog data from a host directly to elasticsearch without running elastic-agent/beats on an end host.
In earlier versions of Elastic Agent (e.g. 8.4.2), under very high load the agent would become unresponsive and leave zombie processes behind. Maybe this has improved in Elastic Agent 8.9.1?
For my use-case, I don't see a lot of value in the syslog processor, because I still need to run the elastic-agent on the end host.
Am I understanding this correctly, or is there something that I am not understanding properly?
The syslog processor detaches the syslog parsing functionality from whatever Filebeat input is being used. Prior to this, only the syslog input was available, which meant you were forced to use either TCP, UDP, or a unix socket as means for consuming syslog messages. Now that the syslog processor is available, any input can be used, it is only a matter of passing the syslog message data to the processor.
This does mean that you must still use either Elastic Agent or Filebeat if you want to use this particular syslog processor.
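To illustrate the decoupling, here is a sketch of what this could look like in a standalone Filebeat configuration, with a TCP input feeding the syslog processor (the listen address and output host are placeholder values):

```yaml
filebeat.inputs:
  # Any input can feed the syslog processor; here a plain TCP listener
  - type: tcp
    host: "0.0.0.0:9000"
    processors:
      # Parse the raw line in the "message" field as a syslog message
      - syslog:
          field: message

output.elasticsearch:
  # Placeholder Elasticsearch endpoint
  hosts: ["https://localhost:9200"]
```

The same parsing could equally be attached to a UDP, unix, or filestream input, which is the point of splitting the processor out of the old syslog input.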
Does the syslog-processor accept syslog input from over TCP, UDP, unix domain socket?
Would this be done through elastic-agent?
I would like to send messages from syslogd, and then use this processor to ship the messages to Elastic. It's not clear to me from the documentation for this processor where the input comes from.
Elastic Agent processors are lightweight processing components that you can use to parse, filter, transform, and enrich data at the source.
And also this one, which explains where you configure the processor.
The processor does not exist on its own; it is part of an integration and is executed on the Elastic Agent that runs that integration.
For example, if you want to receive syslog data over TCP, you will need to create a Custom TCP Logs integration and configure the syslog processor in that integration.
The syslog processor itself doesn't handle external input. A Filebeat input would sit in front of the chain, read in syslog messages from some source (TCP, UDP, file, etc.), and then pass them to the syslog processor. As Leandro mentioned, in the context of Elastic Agent, the input/processor chain will be part of an existing integration such as the Custom TCP or Custom UDP Logs integration; syslog parsing just needs to be enabled when configuring the integration.
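In the Custom TCP Logs integration, the processor can be added through the integration's "Processors" setting in the agent policy; the YAML you would paste there is a sketch along these lines (assuming the raw line lands in the `message` field):

```yaml
# Processors section of a Custom TCP Logs integration:
# parse each received line as a syslog message
- syslog:
    field: message
```

The integration's own settings (listen address and port) take the place of the input configuration, and the agent running the policy does the parsing.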
Thanks for your replies. The information you have provided is accurate and very useful!
For the syslog processor, would it be OK to add some explanation in:
covering what you have mentioned, and cross-referencing things like the Filebeat input and the Custom TCP or Custom UDP Logs integrations?
I can even take a whack at submitting a docs PR against that page with what I think clarifies the concepts for
me, and the PR can be evaluated/refined with feedback.