Kafka Module pipeline is not working

I want to send Kafka logs to Elasticsearch through Logstash. I have configured filebeat.yml and I can see my logs in Kibana, but without the Kafka exported fields that the Kafka module's pipeline.json is supposed to apply.

Filebeat Version: 6.7.0

Filebeat.yml

filebeat.config.modules:
  enabled: true
  path: /software/tools/filebeat/modules.d/*.yml

filebeat.modules:
- module: kafka
  log:
    enabled: true
    var.kafka_home: "/opt/kafka"
    var.paths:
      - "/logs/kafka/controller.log*"
      - "/logs/kafka/server.log*"
      - "/logs/kafka/state-change.log*"
      - "/logs/kafka/kafka-*.log*"
    fields:
      event_environment_id: fdt2
    fields_under_root: true
    scan_frequency: 1s
    max_bytes: 52428800

output.logstash:
  hosts: ["elkhost1:9285","elkhost2:9285","elkhost3:9285","elkhost4:9285","elkhost5:9285","elkhost6:9285","elkhost7:9285","elkhost8:9285"]
  loadbalance: true
  ttl: 60
  bulk_max_size: 256
  slow_start: true

Here is how the logs look in Kibana: [screenshot]

Here are the Kafka exported fields I expect to see, per the following documentation:
https://www.elastic.co/guide/en/beats/filebeat/master/exported-fields-kafka.html

So why can't I see these fields? Am I missing any attributes in the Filebeat configuration?

Hi,

Given your Filebeat configuration, I'm wondering if the Filebeat index template has been installed properly. Could you please post the results of the following Elasticsearch API call?

GET _cat/templates/f*
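
If the template turns out to be missing, it can also be loaded manually from the Filebeat host. A sketch, assuming the host can reach Elasticsearch directly (the localhost:9200 address is a placeholder):

filebeat setup --template -E output.logstash.enabled=false -E 'output.elasticsearch.hosts=["localhost:9200"]'

This temporarily disables the Logstash output so that filebeat setup can talk to Elasticsearch and install the filebeat-* index template.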

Thanks,

Shaunak

Thanks, Shaunak, for your reply. Unfortunately, I don't have access to the Elasticsearch API, so I can't load the Kafka ingest pipeline configuration that way.

So, to solve this issue, I added all the logic from the JSON file below to the filter section of my Logstash configuration file:
<FILE_BEAT_INSTALL>/module/kafka/log/ingest/pipeline.json
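
For reference, a minimal sketch of what such a filter section can look like. The grok pattern below is illustrative only and assumes the default Kafka log4j line layout ([2019-04-08 10:45:12,345] INFO some message (kafka.some.Class)); the authoritative patterns and field names should be copied from pipeline.json and the exported-fields documentation linked above:

filter {
  # Illustrative pattern -- copy the real ones from pipeline.json.
  grok {
    match => {
      "message" => "\[%{TIMESTAMP_ISO8601:[kafka][log][timestamp]}\] %{LOGLEVEL:[kafka][log][level]} +%{GREEDYDATA:[kafka][log][message]} \(%{JAVACLASS:[kafka][log][class]}\)"
    }
  }
  # Use the parsed timestamp as the event time, then drop the raw field.
  date {
    match => [ "[kafka][log][timestamp]", "yyyy-MM-dd HH:mm:ss,SSS" ]
    remove_field => [ "[kafka][log][timestamp]" ]
  }
}

Alternatively, if someone with Elasticsearch API access can load the module's ingest pipelines once, the Logstash elasticsearch output can forward events to them via its pipeline option, which avoids duplicating the parsing logic in Logstash.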
