Filebeat output to Kafka: how do I set up stack monitoring?

I send my logs to Kafka, and I want to add Filebeat to Stack Monitoring. I tried the following configuration, and it works well, but in Kibana I still don't see the state of Filebeat.

output.kafka:
  hosts: ["192.168.1.190:9092"]
  topic: "%{[kafka_topic]}"
  required_acks: 1
  username: "producer"
  password: "xxx"

logging.level: info
logging.to_files: true
logging.files:
  path: /var/log/filebeat
  name: filebeat.log
  keepfiles: 3
  permissions: 0644

# ES monitoring (optional feature)
monitoring:
  enabled: true
  elasticsearch:
    hosts: ["https://192.168.1.191:9200"]
    metrics.period: 120
    backoff.max: 180
    state.period: 600
    backoff.init: 10
    username: "xxx"
    password: "xxxx"
    ssl.certificate_authorities: /etc/filebeat/ssl/ca.crt

I believe that is for Elasticsearch server logs, as in a Filebeat instance parsing Filebeat logs. There should be a separate section in Stack Monitoring where you can see the Beats that are pushing to Elasticsearch.

You mean log collection and stack monitoring are two separate parts? Is my configuration correct?

So what you have configured will push Filebeat metrics to this ES server, https://192.168.1.191:9200. That doesn't mean the actual parsed logs will be sent to the same ES server; you have those going to Kafka. What are you doing with the log data after Kafka?

I can see Filebeat monitoring from this entry point, but I don't know why it doesn't appear under the cluster below. I send the logs to Kafka; how should I configure Filebeat monitoring?

Thanks for your help. After I added cluster_uuid to the Filebeat configuration, Filebeat can be monitored normally.
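
For anyone landing here later: when a Beat's output is not Elasticsearch (here it is Kafka), Filebeat cannot infer which cluster its monitoring data belongs to, so it shows up outside the cluster in Stack Monitoring. Setting monitoring.cluster_uuid associates it explicitly. A minimal sketch based on the config in this thread; the UUID value is a placeholder:

```yaml
# Associate this Beat's monitoring data with an existing ES cluster.
# You can read the cluster's UUID from the "cluster_uuid" field returned
# by a GET request to the cluster root, e.g. https://192.168.1.191:9200/
monitoring:
  enabled: true
  cluster_uuid: "YOUR-CLUSTER-UUID"   # placeholder; use your cluster's value
  elasticsearch:
    hosts: ["https://192.168.1.191:9200"]
    username: "xxx"
    password: "xxxx"
    ssl.certificate_authorities: /etc/filebeat/ssl/ca.crt
```

With this set, the Filebeat instance should be listed under that cluster in Kibana's Stack Monitoring instead of as a standalone instance.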
