Filebeat PubSub module throughput issue

I have a single Filebeat 8.4.3 instance running in AWS that pulls GCP audit events from a GCP Pub/Sub topic and outputs them to a Kafka topic. I have tried various configurations and settings without success: the Filebeat Pub/Sub module never goes above 650 messages/s, even with the settings below and a CPU load of only 0.5.

However, spinning up a second Filebeat instance subscribing to the same Pub/Sub topic simply doubles the throughput to 1.3k messages/s.
My questions are:

  1. Is there a limit on the throughput of a single Filebeat Pub/Sub input?
  2. Can the Kafka output queue per module limit the throughput?
    # Module: gcp

    - module: gcp
      enabled: true

      # Google Cloud project ID.
      var.project_id: MY_PROJECT_ID

      # Google Pub/Sub topic containing audit logs. Stackdriver must be
      # configured to use this topic as a sink for audit logs.
      var.topic: gcp-pubsub-topic

      # Google Pub/Sub subscription for the topic. Filebeat will create this
      # subscription if it does not exist.
      var.subscription_name: gcp-audit-pubsub-sub

      # Credentials file for the service account with authorization to read from
      # the subscription.
      var.credentials_file: /etc/filebeat/gcp.json

      # Increase the number of goroutines, hoping to increase throughput.
      var.subscription.num_goroutines: 32
      var.subscription.max_outstanding_messages: 120000

The Kafka output settings:

    partition.round_robin:
      reachable_only: true
      group_events: 16384
    required_acks: 1
    compression: gzip
    compression_level: 9
    max_message_bytes: 1000000
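For context, the settings that usually bound a single instance's throughput on the output side are Filebeat's internal memory queue and the Kafka output's worker/batch settings, rather than the Pub/Sub input itself. A minimal sketch of those knobs follows; the host, topic, and all numeric values here are illustrative assumptions, not tested recommendations:

    # Internal memory queue that feeds the outputs; small defaults can
    # starve a fast input.
    queue.mem:
      events: 65536            # total in-flight events held in the queue
      flush.min_events: 4096   # batch size handed to the output workers
      flush.timeout: 1s

    output.kafka:
      hosts: ["kafka-broker:9092"]   # hypothetical broker address
      topic: "gcp-audit"             # hypothetical Kafka topic
      worker: 4                      # parallel producer workers per host
      bulk_max_size: 8192            # max events per Kafka request
      compression: gzip
      compression_level: 3           # level 9 burns CPU for little extra ratio

Note in particular that `compression_level: 9` makes every batch CPU-expensive to compress, which can cap a single instance's throughput even when overall CPU load looks low.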
