Filebeat 7.12 to output to multiple Kafka topics

Hi,
I'm on Filebeat 7.12 (CentOS 7). How do I output to multiple Kafka topics?
Please advise.

filebeat:
prospectors:
- type: log
enabled: true
paths:
-/opt/xssssss/log/xxx.txt
close_rename: true
fields:
document_type: "xxxxxx1test"
tail_files: true
- type: log
enabled: true
paths:
-/var/log/mesg
fields:
document_type: "xxxxxx1topic"
tail_files: true

output.kafka:
enabled: true
hosts: ["x.x.x.x:9092", "x.x.x.x:9092", "x.x.x.x:9092"]
topic: "%{[type]}"
partition.round_robin:
reachable_only: false
required_acks: 1

The log error shows: Exiting: error loading config file: yaml: line 37: mapping values are not allowed in this context, and line 37 is enabled: true

Well, you did not format your code using the </> button, so we cannot tell if it is valid, but the error indicates it is not.

YAML needs exact indentation and syntax; I'm pretty sure yours does not have it, and that is what the error is saying.

Also, I noticed that this line should use single quotes, per the documentation example here:

topic: '%{[fields.log_topic]}'
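
For context, here is a minimal sketch of how that reference pairs with a field defined on the input (the path, broker, and log_topic value below are placeholders, not from your config):

filebeat.inputs:
  - type: log
    paths:
      - /var/log/example.log        # placeholder path
    fields:
      log_topic: "example-topic"    # placeholder; referenced by the output below

output.kafka:
  hosts: ["broker1:9092"]           # placeholder broker
  topic: '%{[fields.log_topic]}'    # resolves to "example-topic" for events from this input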

My bad. I hope this helps.

 filebeat:
   prospectors:
        - type: log
            enabled: true
            paths: 
                -/xx/xx/xxx.txt
            close_rename: true
            fields:
                document_type: "xxxxx1test"
            tail_files: true
        - type: log
            enabled: true
            paths:
                -/var/log/mesg
            fields:
                document_type: "xxxx1topic"   
            tail_files: true

  scan_frequency: 10s
  backoff: 1s
  close_inactive: 1m
  ignore_older: 2h
  clean_inactive: 25h

output.kafka:
  enabled: true

  hosts: ["192.xxx.xxx.120:9092", "192.xxx.xxx.122:9092", "192.xxx.xxx.123:9092"]
  topic: "%{[type]}"
  partition.round_robin:
     reachable_only: false
  required_acks: 1

Apologies, I am unclear what you are asking. That seems to be only a partial filebeat.yml file, and it is not correctly formatted in any way.

I am not sure where you got prospectors from.

This command can be used to test the validity of your configuration:

./filebeat test config -c ./filebeat.yml

I suggest perhaps taking a closer look at the documentation, starting from the filebeat.yml that comes with the distribution, and working from there.

Here is a filebeat.yml that passes the parsing test; perhaps it will help.

filebeat.inputs:

    - type: log
      enabled: true
      paths: 
        - /xx/xx/xxx.txt
      close_rename: true
      fields:
        document_type: "xxxxx1test"
      tail_files: true
      scan_frequency: 10s
      backoff: 1s
      close_inactive: 1m
      ignore_older: 2h
      clean_inactive: 25h

    - type: log
      enabled: true
      paths:
       - /var/log/mesg
      fields:
        document_type: "xxxx1topic"  
      tail_files: true
      scan_frequency: 10s
      backoff: 1s
      close_inactive: 1m
      ignore_older: 2h
      clean_inactive: 25h

output.kafka:
  enabled: true

  hosts: ["192.xxx.xxx.120:9092", "192.xxx.xxx.122:9092", "192.xxx.xxx.123:9092"]
  topic: "%{[fields.document_type]}"
  partition.round_robin:
    reachable_only: false
  required_acks: 1
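
As a side note, if I recall the Kafka output docs correctly, it also supports a topics list for rule-based topic selection, as an alternative to interpolating a field. A rough sketch with placeholder broker, topic names, and match string (check the docs for your version):

output.kafka:
  hosts: ["broker1:9092"]      # placeholder broker
  topic: "default-topic"       # placeholder; used when no rule below matches
  topics:
    - topic: "error-topic"     # placeholder; chosen for events matching the condition
      when.contains:
        message: "ERROR"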

Hi Stephen, I have a working config with only a single output to a Kafka topic, and I wish to output to 2 different topics. I picked up some examples from the internet, and they claimed this config worked.

Here's the error:
Exiting: error loading config file: yaml: line 37: mapping values are not allowed in this context

and line 37 refers to "enabled: true"

Show me your entire, exact working filebeat.yml config with the 1 output that is working, then show me your exact config trying multiple topics. That first config you showed would never have worked; it was invalid.

That is a simple, direct YAML syntax error.

The config you shared has only 32 lines, so you didn't share the full config or you are running another config file for some reason.

Also, prospectors was changed to inputs in version 6.3 or 6.4; I'm not sure if the newer versions still work with a configuration using prospectors.
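
For reference, a minimal sketch of that rename (the path is a placeholder):

Old key (deprecated):
  filebeat.prospectors:
    - type: log
      paths:
        - /var/log/example.log    # placeholder path

New key:
  filebeat.inputs:
    - type: log
      paths:
        - /var/log/example.log    # placeholder path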

When checking examples from the internet, it is always good to look at the official documentation for the version you are using to see if the example still applies; the tools in the Elastic Stack move pretty fast, and a lot of things can get deprecated or stop working.

As asked, share your working config and the config you are trying to run.

Hi, here's my working copy.
The objective is to output to various Kafka topics based on different inputs.

filebeat.inputs:
- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
     - /xxx/xxx/xxx.txt

  scan_frequency: 10s
  backoff: 1s
  close_inactive: 1m
  ignore_older: 2h
  clean_inactive: 25h

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: true

  # Period on which files under path should be checked for changes
  reload.period: 10s

# -------------------------------- Kafka Output --------------------------------
output.kafka:
  enabled: true


  hosts: ["192.xxx.xxx.120:9092", "192.xxx.xxx.122:9092", "192.xxx.xxx.123:9092"]
  topic: "xxxxx1test"

Does anyone know? I have been googling, and I couldn't find anything except examples with prospectors.

Are you sure you are running the config you shared?

You said that you got this error:

Exiting: error loading config file: yaml: line 37: mapping values are not allowed in this context

The filebeat configuration you shared only has 33 lines, so it can't be this config that is giving you that error.

Outputting to different Kafka topics according to a field is documented; the example configuration that Stephen shared will do that.

You first need to have a field in your document to select the topic on; you can add a field with this:

      fields:
        kafka_topic: "topic-name"

Then in the Kafka output you will refer to this field this way:

output.kafka:
  hosts: ["kafka-brokers"]
  topic: '%{[fields.kafka_topic]}'
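
Putting those two pieces together, a minimal two-input sketch would look like this (paths, topic names, and the broker are placeholders):

filebeat.inputs:
  - type: log
    paths:
      - /path/to/first.log            # placeholder path
    fields:
      kafka_topic: "first-topic"      # placeholder topic name
  - type: log
    paths:
      - /path/to/second.log           # placeholder path
    fields:
      kafka_topic: "second-topic"     # placeholder topic name

output.kafka:
  hosts: ["broker1:9092"]             # placeholder broker
  topic: '%{[fields.kafka_topic]}'    # resolves per input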

There's nothing else to do; you just need to make sure which config you are actually running. You shared a config with 33 lines and you are saying that you got an error at line 37, which does not match.


Hi Leandro, it's indeed line #37; I removed the default comments that come with the installation.

Hi Leandro, thank you for your help. I'm able to output 2 different logs to different topics.
Thank you 🙂

filebeat.inputs:
- type: log

  # Change to true to enable this input configuration.
  enabled: true

  fields:
    kafka_topic: "xxxxx1test"

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
     - /xxx/xxx/xxx/xxx/xxxx.txt

- type: log

  enabled: true

  fields:
    kafka_topic: "xxxxx1topic"

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/messages

  scan_frequency: 10s
  backoff: 1s
  close_inactive: 1m
  ignore_older: 2h
  clean_inactive: 25h

output.kafka:
  enabled: true
  hosts: ["kafka-brokers"]
  topic: '%{[fields.kafka_topic]}'
