Unable to push Filebeat published messages to Kafka output

Hi Team,
I am working on a setup where I am trying to push messages read by Filebeat to a Kafka output. Although I can see Filebeat publishing messages, I don't receive anything in the Kafka topic. Below is my filebeat.yml file; please review it and let me know if I am missing any key field.


```yaml
filebeat.prospectors:
- type: log

  # Change to true to enable this prospector configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  # To fetch all ".log" files from a specific level of subdirectories
  # /var/log/*/*.log can be used.
  # For each file found under this path, a harvester is started.
  # Make sure no file is defined twice as this can lead to unexpected behaviour.
  paths:
    - /opt/bea/Logwarhouse/Genaccess/access_log*

  # Exclude lines. A list of regular expressions to match. It drops the lines that are
  # matching any regular expression from the list. The include_lines is called before
  # exclude_lines. By default, no lines are dropped.
  exclude_lines: ['pong']

#============================= Filebeat modules ===============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

#================================ Outputs ======================================

# Configure what output to use when sending the data collected by the beat.

#------------------------------- Kafka output ----------------------------------
output.kafka:
  # Boolean flag to enable or disable the output module.
  enabled: true

  # The list of Kafka broker addresses from where to fetch the cluster metadata.
  # The cluster metadata contain the actual Kafka brokers events are published to.
  hosts: ["localhost:9092"]

  # The Kafka topic used for produced events. The setting can be a format string
  # using any event field. To set the topic from document type use %{[type]}.
  topic: '%{[apache_stov_topic]}'
  reachable_only: false

  # Sets the output compression codec. Must be one of none, snappy and gzip. The
  # default is gzip.
  compression: gzip

  # The maximum permitted size of JSON-encoded messages. Bigger messages will be
  # dropped. The default value is 1000000 (bytes). This value should be equal to
  # or less than the broker's message.max.bytes.
  max_message_bytes: 1000000
```
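One thing worth double-checking: `topic` here is a format string referencing the event field `apache_stov_topic`. If events don't actually carry that field, the topic name can't be resolved and the events can be dropped with an error in the Filebeat log instead of reaching Kafka. A minimal sketch of two possible fixes (the field name and topic value below are illustrative, taken from your config):

```yaml
# Option 1: use a static topic name (simplest, assuming one topic is enough)
output.kafka:
  hosts: ["localhost:9092"]
  topic: 'apache_stov_topic'

# Option 2: keep the format string, but make sure every event carries the
# field it references, e.g. by adding this to the prospector section:
#   fields_under_root: true
#   fields:
#     apache_stov_topic: 'apache_stov_topic'
```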

Are there any error messages in the Filebeat or Kafka logs?

Please use the </> button in the editor to format configuration files and logs. The Filebeat configuration is sensitive to formatting; without proper formatting it's very difficult to read configs.
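To narrow it down, it can help to run Filebeat in the foreground with Kafka/output debug logging, and to consume the topic directly. A sketch (the Kafka install path and the literal topic name are assumptions; adjust for your setup):

```shell
# Run Filebeat in the foreground, logging to stderr, with debug selectors enabled
filebeat -e -d "kafka,output" -c /etc/filebeat/filebeat.yml

# In another terminal, consume the topic directly to see whether anything arrives
# (path to the Kafka scripts is an assumption; adjust for your install)
/opt/kafka/bin/kafka-console-consumer.sh \
  --bootstrap-server localhost:9092 \
  --topic apache_stov_topic \
  --from-beginning
```

If the console consumer shows nothing while Filebeat reports events published, the debug log output should show whether the publish attempts are failing (for example, on topic resolution or broker connectivity).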

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.