I am working on a setup where I am trying to push messages read by Filebeat to a Kafka output. Though I can see Filebeat publishing messages, I don't receive anything on the Kafka topic. Below is my filebeat.yml file; please review it and point out any key field I might be missing.
```yaml
# Change to true to enable this prospector configuration.

# Paths that should be crawled and fetched. Glob based paths.
# To fetch all ".log" files from a specific level of subdirectories,
# /var/log/*/*.log can be used.
# For each file found under this path, a harvester is started.
# Make sure no file is defined twice, as this can lead to unexpected behaviour.

# Exclude lines. A list of regular expressions to match. It drops the lines
# that match any regular expression from the list. include_lines is applied
# before exclude_lines. By default, no lines are dropped.

#============================= Filebeat modules ===============================

# Glob pattern for configuration loading
# Set to true to enable config reloading
# Period on which files under path should be checked for changes

#================================ Outputs ======================================

# Configure what output to use when sending the data collected by the beat.

#------------------------------- Kafka output ----------------------------------

# Boolean flag to enable or disable the output module.

# The list of Kafka broker addresses from where to fetch the cluster metadata.
# The cluster metadata contain the actual Kafka brokers events are published to.

# The Kafka topic used for produced events. The setting can be a format string
# using any event field.

# Sets the output compression codec. Must be one of none, snappy and gzip. The
# default is gzip.

# The maximum permitted size of JSON-encoded messages. Bigger messages will be
# dropped. The default value is 1000000 (bytes). This value should be equal to
# or less than the broker's message.max.bytes.
```
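For comparison, a minimal Kafka output section in filebeat.yml usually looks something like the sketch below. The broker address `localhost:9092` and the topic name `filebeat-logs` are placeholders, not values taken from the file above:

```yaml
#------------------------------- Kafka output ----------------------------------
output.kafka:
  # Boolean flag to enable or disable the output module.
  enabled: true

  # The list of Kafka broker addresses from where to fetch the cluster metadata.
  hosts: ["localhost:9092"]   # placeholder broker address

  # The Kafka topic used for produced events.
  topic: "filebeat-logs"      # placeholder topic name

  # Sets the output compression codec. Must be one of none, snappy and gzip.
  compression: gzip

  # Should be equal to or less than the broker's message.max.bytes.
  max_message_bytes: 1000000
```

Two things worth checking in a situation like this: Filebeat supports only one output at a time, so if `output.elasticsearch` or `output.logstash` is still enabled, the Kafka output will not work; and the topic the consumer reads must match the `topic` setting exactly. You can verify whether events actually reach the topic with the console consumer that ships with Kafka, e.g. `bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic filebeat-logs --from-beginning`.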