I want to ship Kafka logs to Elasticsearch through Logstash. I have configured filebeat.yml and I can see my logs in Kibana, but without the Kafka exported fields that the Kafka module's pipeline.json is supposed to apply.
Filebeat Version: 6.7.0
filebeat.yml:
filebeat.config.modules:
  enabled: true
  path: /software/tools/filebeat/modules.d/*.yml

filebeat.modules:
  - module: kafka
    log:
      enabled: true
      var.kafka_home: "/opt/kafka"
      var.paths:
        - "/logs/kafka/controller.log*"
        - "/logs/kafka/server.log*"
        - "/logs/kafka/state-change.log*"
        - "/logs/kafka/kafka-*.log*"
      fields:
        event_environment_id: fdt2
      fields_under_root: true
      scan_frequency: 1s
      max_bytes: 52428800

output.logstash:
  hosts: ["elkhost1:9285","elkhost2:9285","elkhost3:9285","elkhost4:9285","elkhost5:9285","elkhost6:9285","elkhost7:9285","elkhost8:9285"]
  loadbalance: true
  ttl: 60
  bulk_max_size: 256
  slow_start: true
Here is how the logs look in Kibana:
Here are the Kafka exported fields, as per the following documentation:
https://www.elastic.co/guide/en/beats/filebeat/master/exported-fields-kafka.html
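If I read that page correctly, each Kafka log event should gain fields such as kafka.log.component and the kafka.log.trace.* group, so I expected documents with something roughly like this on them (values invented for illustration):

{
  "kafka.log.component": "kafka.controller.KafkaController",
  "kafka.log.trace.class": "java.lang.IllegalStateException"
}

None of these appear on my documents.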
So why can't I see these fields? Am I missing any attributes in the Filebeat configuration?
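One thing I suspect: because the events go through Logstash rather than straight to Elasticsearch, the module's ingest pipeline (the pipeline.json mentioned above) may never be executed, since it lives in Elasticsearch and is normally triggered by Filebeat's own elasticsearch output. If I understand the Logstash docs correctly, Filebeat (6.5+) puts the pipeline name into @metadata, and the Logstash elasticsearch output has to pass it along explicitly. A rough sketch of what I think that would look like (the beats input, host and port are placeholders, not my real Logstash config):

input {
  beats {
    port => 5044
  }
}

output {
  if [@metadata][pipeline] {
    # Events from Filebeat modules carry the ingest pipeline name in @metadata;
    # hand it to Elasticsearch so the Kafka module's pipeline.json is applied.
    elasticsearch {
      hosts => ["eshost:9200"]
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      pipeline => "%{[@metadata][pipeline]}"
    }
  } else {
    elasticsearch {
      hosts => ["eshost:9200"]
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    }
  }
}

Is something like this required, together with loading the module pipelines into Elasticsearch first (e.g. via filebeat setup), or can this be solved purely on the Filebeat side?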