May I know if anyone has experience configuring Metricbeat to collect metrics from Kerberised Kafka?
Here is my kafka.yml. I have also granted Read and Describe operations on my Kafka instances to the user stats.
- module: kafka
  metricsets:
    - partition
    - consumergroup
  period: 10s
  hosts:
    - broker1:9092
    - broker2:9092
    - broker3:9092
  client_id: stats
According to the log, it seemingly connected to the brokers:
2020-01-10T09:28:57.157+1300 INFO kafka/log.go:53 Connected to broker at broker1:9092 (unregistered)
2020-01-10T09:28:57.390+1300 INFO kafka/log.go:53 Closed connection to broker broker1:9092
However, it failed to fetch data for the partition and consumergroup metricsets:
2020-01-10T09:28:57.672+1300 INFO module/wrapper.go:252 Error fetching data for metricset kafka.partition: error in connect: failed to query metadata: EOF
2020-01-10T09:28:57.672+1300 INFO module/wrapper.go:252 Error fetching data for metricset kafka.consumergroup: error in connect: failed to query metadata: EOF
It doesn't seem to be an issue with the module itself. From other posts I've found, it looks like an issue with Metricbeat collecting metrics from Kerberised Kafka.
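In case it helps narrow it down, the kind of Kerberos settings I would expect the module to need are along these lines. This is only a sketch: I'm not sure which Metricbeat version accepts the kerberos.* options for the kafka module, and the keytab path, principal and realm below are placeholders for my environment.

- module: kafka
  # ... metricsets, period, hosts and client_id as above ...
  # Kerberos (GSSAPI) settings as documented for Beats' Kafka support;
  # I am not certain my Metricbeat version supports them for this module.
  kerberos.enabled: true
  kerberos.auth_type: keytab
  kerberos.username: stats                              # placeholder principal
  kerberos.keytab: /etc/security/keytabs/stats.keytab   # placeholder path
  kerberos.config_path: /etc/krb5.conf
  kerberos.service_name: kafka
  kerberos.realm: EXAMPLE.COM                            # placeholder realm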
I can query the metadata via the Kafka command-line scripts using the same client id as in Metricbeat's kafka.yml, yet Metricbeat still complains that it failed to query metadata. So it seems to me the problem is more on the Metricbeat side rather than with the Kerberised Kafka cluster.
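For reference, the check I mean is along the lines of kafka-topics.sh --bootstrap-server broker1:9092 --list --command-config client.properties, with a client.properties roughly like the one below. The keytab path and principal are placeholders for my actual values, and I'm assuming a SASL_PLAINTEXT listener (it would be SASL_SSL if TLS were enabled on the brokers).

# client.properties used with the Kafka command-line tools
security.protocol=SASL_PLAINTEXT
sasl.mechanism=GSSAPI
sasl.kerberos.service.name=kafka
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
  useKeyTab=true keyTab="/etc/security/keytabs/stats.keytab" principal="stats@EXAMPLE.COM";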