Filebeat not getting Kafka logs

Hello all,

I'm trying to set up log monitoring for my Kafka setup. I'm using

```
image: confluentinc/cp-server:7.6.0
```

and the ELK stack at version 7.14. Here is my filebeat.yml:

```yaml
kafka.home: /opt/kafka

filebeat.config.modules:
  path: /usr/share/filebeat/modules.d/*.yml

filebeat.inputs:
  - input_type: log
    paths:
      - ${kafka.home}/logs/controller.log*
      - ${kafka.home}/logs/server.log*
      - ${kafka.home}/logs/state-change.log*
      - ${kafka.home}/logs/kafka-.log
    multiline.pattern: '^\['
    multiline.negate: true
    multiline.match: after
    fields.pipeline: kafka-logs

  - input_type: log
    paths:
      - ${kafka.home}/logs/kafkaServer-gc.log
    multiline.pattern: '^\s'
    multiline.negate: false
    multiline.match: after
    include_lines: ['GC pause']
    fields.pipeline: kafka-gc-logs

output.elasticsearch:
  hosts: ["elasticsearch:9200"]
  index: 'kafkalogs-%{+yyyy.MM.dd}'
  pipeline: '%{[fields.pipeline]}'
  username: "elastic"
  password: "changeme"

setup.kibana:
  host: "http://kibana:5601"

setup.template.name: "desktop"
setup.template.pattern: "desktop-*"
```

and here is my kafka.yml:

```yaml
- module: kafka
  log:
    enabled: true
    var.paths:
      - "/logs/*.log"
```
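For context, the `/logs` path in `var.paths` assumes the broker's log directory is shared into the Filebeat container. My compose wiring looks roughly like this (a sketch; the volume and container names are exactly the part I'm unsure about):

```yaml
# sketch of the assumed docker-compose wiring (names/paths are my guesses)
filebeat:
  image: docker.elastic.co/beats/filebeat:7.14.0
  volumes:
    - ./filebeat.yml:/usr/share/filebeat/filebeat.yml:ro
    - kafka-logs:/logs:ro   # the same volume the broker writes its log files to
```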

After enabling the module and running the setup, I see no logs in Kibana. When I run `docker logs broker`, I can see WARN messages being generated, and I would expect those to show up in Kibana as well.
I suspect that either the Confluent Kafka image doesn't keep Kafka under /opt/, or I have some other path configured wrongly.
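For reference, this is how I've been poking around so far. `broker` is my Kafka container; `filebeat` is an assumption for the Filebeat container name, so adjust as needed:

```shell
# Does the cp-server image actually write log files under /opt/kafka/logs?
docker exec broker ls /opt/kafka/logs

# Sanity-check the Filebeat config and the Elasticsearch output connection
docker exec filebeat filebeat test config
docker exec filebeat filebeat test output
```

`filebeat test config` at least confirms the YAML parses, but it doesn't tell me whether the path globs match any files.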

Any advice would be greatly appreciated :slight_smile: