Include_lines for pubsub input isn't working: no filtering

Hi everyone,

Today I’m encountering some problems using Filebeat with the Pub/Sub input.

I want to fetch data from Pub/Sub (on GCP) and then push it into Kafka. That part works fine.
However, I need to filter lines to keep only those whose “textPayload” starts with a date.

So I use this configuration:

filebeat.inputs:
  - type: gcp-pubsub
    project_id: blahblah
    topic: mytopic
    subscription.name: mysub
    credentials_json: {somecredentials}
    include_lines: ['something']
    fields_under_root: true
    fields:
      type_log: GCP
  - type: gcp-pubsub
    project_id: blahblah
    topic: mysecondtopic
    subscription.name: mysecondsub
    credentials_json: {somecredentials}
    fields_under_root: true
    fields:
      type_log: NGINX

filebeat.registry.path: /usr/share/filebeat/data/registry
name: SuperName
max_procs: 1
fields_under_root: true
fields:
  namespace: mynamespace
  topic: TOPIC_NAME_FOR_REFERENCE

output.kafka:
  enabled: true
  hosts: ["my.kafka.com:12345"]
  topic: '%{[topic]}'
  max_message_bytes: 5000000
  compression: gzip
  ssl.enabled: true
  ssl.renegotiation: freely
  ssl.certificate_authorities: ["/path/to/file.pem"]
  required_acks: 1
  partition.round_robin:
    reachable_only: false

logging.level: info
logging.to_stderr: true
logging.metrics.enabled: false


The problem is: no matter what I put in include_lines, ALL lines come into Kafka. ALL of them :sad_but_relieved_face:.

Do you have any suggestions? Does include_lines really work with the gcp-pubsub input? Is this input broken?

I’m using Filebeat 8.17.

Thanks in advance for any help :folded_hands:

Hello @blankoworld

Welcome to the Community!!

As per the documentation:

It has a processor for filtering:

processors:
  - drop_event:
      when:
        regexp:
          message: "^DBG:"

I am not sure, but it may be that the gcp-pubsub input does not consider the include_lines parameter at all, which would explain why it sends all the data.
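As a sketch of how that processor could replace include_lines in your case: drop every event that does NOT start with a date, using a negated regexp condition. The field name (message) and the date pattern (YYYY-MM-DD) are assumptions here; adjust them to match how your Pub/Sub payload is actually decoded into the event:

processors:
  - drop_event:
      when:
        not:
          regexp:
            message: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'

You can put this processors block under each gcp-pubsub input so the filtering only applies to that input, rather than at the top level of the config.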

Thanks!!