Proxy Filebeat module events to Elasticsearch ingest pipeline

Hi,

I am using Filebeat to send custom logs to Logstash, but I would also like to use the prebuilt modules for things like system or audit logs.

By default, Filebeat modules are configured to use Elasticsearch ingest pipelines, but having everything go through Logstash is more manageable. I first tried to convert the ingest pipelines to Logstash pipelines, but it is a tedious task and some things are impossible to port (such as the Painless scripts used by the auditd module).
I then tried to use the elasticsearch output with the pipeline option, but Filebeat does not seem to send the name of the Elasticsearch ingest pipeline when it is configured with the Logstash output, and constructing the pipeline name by hand from the Filebeat version and module seems brittle.
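
For reference, what I tried on the Logstash side looks roughly like the sketch below. The `[@metadata][pipeline]` field name is an assumption on my part, based on what I expected Filebeat to forward:

```
input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    # Ask Elasticsearch to run the ingest pipeline named in the event metadata.
    # In my tests this field does not seem to be populated when Filebeat ships
    # through its Logstash output, so the pipeline option ends up empty.
    pipeline => "%{[@metadata][pipeline]}"
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
```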

My workaround at the moment is to run two Filebeat instances: one with only the custom logs (sending to Logstash) and the other with only the modules (sending to Elasticsearch). This is not ideal, as I have to create and maintain that second service manually, and I have to open both Logstash and Elasticsearch to the rest of the network.
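
Roughly, the two instances are split like this (hosts and paths below are placeholders):

```
# Instance 1 (e.g. filebeat-custom.yml): custom logs only, shipped to Logstash.
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log

output.logstash:
  hosts: ["logstash.example.internal:5044"]
```

```
# Instance 2 (e.g. filebeat-modules.yml): modules only, shipped directly to
# Elasticsearch so that each module's ingest pipeline is applied.
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

output.elasticsearch:
  hosts: ["elasticsearch.example.internal:9200"]
```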

Is there a way to send a Filebeat module event to the Elasticsearch ingest pipeline it was supposed to go to, with Logstash acting as a proxy?
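
In essence I am looking for something like the sketch below, but without having to hard-code the naming convention myself; the field names and the pipeline naming scheme here are my own assumptions:

```
filter {
  # Guess the ingest pipeline name by convention, something like
  # "filebeat-<version>-<module>-<fileset>-pipeline". This is exactly the
  # brittle part I would rather not maintain by hand.
  if [event][module] and [fileset][name] {
    mutate {
      add_field => {
        "[@metadata][pipeline]" => "filebeat-%{[@metadata][version]}-%{[event][module]}-%{[fileset][name]}-pipeline"
      }
    }
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    pipeline => "%{[@metadata][pipeline]}"
  }
}
```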

Thanks,
Thomas

Do you have any thoughts about this?
