Filebeat modules with Logstash

I've set up a Filebeat -> Logstash -> Kibana workflow that has been working for the last two years.
The most recent Elastic package versions introduce new features, such as Filebeat modules, the Kibana SIEM app, etc.

Some of the current log parsing can be done with those Filebeat modules (for example Apache and Nginx), but ModSecurity, Postfix, and other custom logs still need to be parsed with Logstash grok filters.
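For illustration, a custom Postfix line can be handled with a grok filter along these lines (a minimal sketch with a hypothetical pattern and field names, not the actual config from this setup):

filter {
  # Hypothetical pattern: parse a syslog-style Postfix line such as
  # "Jun  1 10:00:00 mail postfix/smtpd[1234]: connect from unknown[192.0.2.1]"
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} %{SYSLOGHOST:hostname} postfix/%{WORD:component}\[%{POSINT:pid}\]: %{GREEDYDATA:log_message}" }
  }
}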

As far as I know, the Osquery module (for example) needs to ship its events directly to Elasticsearch, otherwise the Kibana SIEM app will not identify the index / fields properly. But Filebeat only accepts one output definition, so other logs that depend on Logstash grok parsing would not be processed.
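To make the constraint concrete, a Filebeat configuration allows only one output section to be enabled at a time (a minimal sketch, with hypothetical hostnames):

# filebeat.yml (sketch): modules enabled, but only ONE output may be active
filebeat.modules:
  - module: apache2
    access:
      enabled: true

output.logstash:            # either this ...
  hosts: ["logstash:5044"]

# output.elasticsearch:     # ... or this, but not both at once
#   hosts: ["es:9200"]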

What is the best approach to handle all kinds of logs?

You can convert (with limited success) the modules' ingest pipelines to Logstash code.

Some things, like scripts, don't convert. Some things can be hand converted, and some functionality just seems to be missing.
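For the mechanical part of that conversion, Logstash ships an experimental converter script for ingest pipelines; the exact path, flags, and file names below are a sketch, so check the docs for your version:

bin/ingest-convert.sh \
  --input file:///tmp/apache2-access-ingest.json \
  --output file:///tmp/apache2-access-logstash.conf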

In a Logstash filter:

if [fileset][module] == "apache2" {
  if [fileset][name] == "access" {
    <the converted pipeline for apache access>
  } else if [fileset][name] == "error" {
    <the converted pipeline for apache error>
  }
}

Another option, in Logstash logic: if the event is from a Filebeat module, add the ingest pipeline name to the Elasticsearch output.

For example, in a filter section:

if [fileset][module] == "auditd" {
  mutate {
    add_tag => [ "use_ingest" ]
  }
}

Then in the output section:

if "use_ingest" in [tags] {
  elasticsearch {
    pipeline => "%{[@metadata][beat]}-%{[@metadata][version]}-%{[fileset][module]}-%{[fileset][name]}-pipeline"
    <your current output>
  }
}

Thanks for the reply, I will take a look.
However, I think the output section doesn't allow conditionals.

It's the Logstash output section. These were mildly sanitized examples from our working pipeline :)

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.