Hello all,
I'm trying to set up log monitoring for my Kafka setup. I'm using
image: confluentinc/cp-server:7.6.0
and the ELK stack at version 7.14. The following is my filebeat.yml:
kafka.home: /opt/kafka
filebeat.config.modules:
  path: /usr/share/filebeat/modules.d/*.yml
filebeat.inputs:
- input_type: log
  paths:
    - ${kafka.home}/logs/controller.log*
    - ${kafka.home}/logs/server.log*
    - ${kafka.home}/logs/state-change.log*
    - ${kafka.home}/logs/kafka-.log
  multiline.pattern: '^['
  multiline.negate: true
  multiline.match: after
  fields.pipeline: kafka-logs
- input_type: log
  paths:
    - ${kafka.home}/logs/kafkaServer-gc.log
  multiline.pattern: '^\s'
  multiline.negate: false
  multiline.match: after
  include_lines: ['GC pause']
  fields.pipeline: kafka-gc-logs
output.elasticsearch:
  hosts: ["elasticsearch:9200"]
  index: 'kafkalogs-%{+yyyy.MM.dd}'
  pipeline: '%{[fields.pipeline]}'
  username: "elastic"
  password: "changeme"
setup.kibana:
  host: "http://kibana:5601"
setup.template.name: "desktop"
setup.template.pattern: "desktop-*"
and this is my kafka.yml module config:
- module: kafka
  log:
    enabled: true
    var.paths:
      - "/logs/*.log"
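For what it's worth, Filebeat ships `test` subcommands that can rule out plain YAML or connectivity problems before chasing path issues (run inside the Filebeat container; the config path below assumes the image default):

```shell
# Validate that the main config file parses
filebeat test config -c /usr/share/filebeat/filebeat.yml

# Check that Filebeat can actually reach the configured Elasticsearch output
filebeat test output -c /usr/share/filebeat/filebeat.yml
```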
After enabling the module and executing the setup, I see no logs in Kibana. When I run "docker logs broker", however, I can see WARN logs being generated, and I feel like those should also be showing up in Kibana.
I feel like either the Confluent Kafka image doesn't have Kafka under /opt/, or I have some other path set up wrongly.
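To check that suspicion, something like the following should show whether /opt/kafka/logs actually exists and contains log files inside the container ("broker" is my container's name; adjust as needed):

```shell
# See what's actually under /opt in the broker container
docker exec broker ls /opt

# Check whether the expected log directory and files exist
docker exec broker ls -la /opt/kafka/logs

# Find where log files are actually written, wherever Kafka lives
docker exec broker sh -c 'find / -name "server.log*" 2>/dev/null'
```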
Any advice would be greatly appreciated.