Include_lines in My Filebeat Module

I have created my own module in Filebeat, and my logs can be ingested.

For filebeat.inputs, there is an option called "include_lines", which lets us include only the lines that match a regex.
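
For illustration, a minimal sketch of that option (the patterns and path here are just placeholders):

    filebeat.inputs:
      - type: log
        paths:
          - "/path/to/log/*log"
        # ship only lines that match one of these regexes
        include_lines: ['^ERR', '^WARN']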

In my Filebeat module, I tried to add "include_lines" in modules.d/mymodule.yml, but the filtering is not working.

If I use filebeat.inputs directly, the fields are not parsed correctly because the log is in my customised format.

May I ask how I could use include_lines in my own Filebeat module? Thanks.

Can you share the exact config you were using? Have a look here on how variables are configured in modules: https://www.elastic.co/guide/en/beats/filebeat/current/specify-variable-settings.html
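
From that page, module variables are prefixed with var. in modules.d/*.yml. A minimal sketch along those lines (the nginx paths are only an example):

    - module: nginx
      access:
        enabled: true
        var.paths: ["/var/log/nginx/access.log*"]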

I created my own module (named "log4j") based on Creating a New Filebeat Module | Beats Developer Guide [master] | Elastic.

After that I did the following:

  1. Copied the module into /usr/share/filebeat/module/
  2. Copied the config file log4j.yml into /etc/filebeat/modules.d/

Here is the content of the module config in /etc/filebeat/modules.d/:

- module: log4j
  applogs:
    enabled: true
    var.paths: ["/path/to/log/*log"]
    include_lines: ["SUCCESS"]

I have the following questions:

  1. My assumption is that "include_lines" should work in modules (e.g. apache2, nginx, or custom modules). Is that correct?
  2. On the official documentation website, "include_lines" is not mentioned in the module section. Instead, it is covered in the filebeat.inputs section (Log input | Filebeat Reference [8.11] | Elastic).

I am not sure how "include_lines" (i.e. filtering the logs before they are ingested into ES) is supposed to work when using modules, because we do have scenarios where we want to filter the logs and ingest only what we need.
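
If the input override mechanism described in the module docs applies here, I would expect something like the following to work, but I have not confirmed it (a sketch only, reusing my paths from above):

    - module: log4j
      applogs:
        enabled: true
        var.paths: ["/path/to/log/*log"]
        # settings under "input" are passed through to the underlying input
        input:
          include_lines: ["SUCCESS"]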

I have a workaround now:

  1. Use filebeat.inputs to configure Filebeat to read the log files, and use "include_lines".

    filebeat.inputs:
     - type: log
       paths:
         - "/path/to/log/*log"
       include_lines: ['SUCCESS']
    
  2. When ingesting into ES, I specify the pipeline to use.

    output.elasticsearch:
      hosts: ["elk:8080"]
      protocol: "http"
      username: "elasticsearch"
      password: "password"
      index: "logs-jboss-%{+yyyy.MM.dd}"
      pipeline: filebeat-6.4.0-log4j-applogs-default

This workaround works fine, but all the input will be processed with the same pipeline!
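
One possible refinement (not tested) would be to set the pipeline per input instead of globally in the output, if the input-level pipeline option applies here. A sketch reusing the names from above:

    filebeat.inputs:
      - type: log
        paths:
          - "/path/to/log/*log"
        include_lines: ['SUCCESS']
        # ingest pipeline for events from this input only
        pipeline: filebeat-6.4.0-log4j-applogs-default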
