Kafka output doesn't use Winlogbeat template

I am trying to set up an ELK stack with Apache Kafka as the log broker. Here is the situation:

  1. Elasticsearch, Logstash, ZooKeeper, Kafka, and Kibana are installed with Docker Compose, each one in a different container (you can see the docker-compose.yml below).

  2. Kafka listens on port 9092 and receives Windows event logs from Winlogbeat (a rough sketch of the Winlogbeat output config is included right after the compose file below).

  3. Logstash is configured as a consumer of the winlogbeat topic in Kafka (you can see pipeline.conf below).

  4. I have already put the Winlogbeat template into Elasticsearch using a curl command (roughly reproduced below, after pipeline.conf).
    The problem is that the data won't use the Winlogbeat template, and the entire event appears only in the message field (I'll share screenshots below).
    However, if I use the elasticsearch output in Winlogbeat and send the data straight into Elasticsearch, everything is fine.
    And of course the data from Filebeat, with the correct filters, is fine too; I still can't figure out why the Winlogbeat data won't use the index template.
    Here is the docker-compose.yml file:

    version: '2'
    services:
      elasticsearch:
        image: elasticsearch:7.5.2
        container_name: elasticsearch
        environment:
          - "discovery.type=single-node"
          - "ES_JAVA_OPTS=-Xms1G -Xmx1G"
        ulimits:
          memlock:
            soft: -1
            hard: -1
        volumes:
          - data08:/usr/share/elasticsearch/data
        ports:
          - "9200:9200"
        networks:
          - elk
      ls01:
        image: logstash:7.5.2
        container_name: ls01
        ports:
          - "5044:5044"
        ulimits:
          memlock:
            soft: -1
            hard: -1
        volumes:
          - /root/elk/ls01/pipeline/:/usr/share/logstash/pipeline/
          - /root/elk/ls01/winlogbeat.template.json:/tmp/winlogbeat.template.json
        networks:
          - elk
        depends_on:
          - elasticsearch
      zoo01:
        image: zookeeper:3.5.6
        container_name: zoo01
        ports:
          - "2181:2181"
        ulimits:
          memlock:
            soft: -1
            hard: -1
        environment:
          ZOO_MY_ID: 1
          ZOO_SERVERS: server.1=0.0.0.0:2888:3888;2181
        networks:
          - elk
        depends_on:
          - elasticsearch
          - ls01
      kafka01:
        build: ./kafka01
        container_name: kafka01
        ports:
          - "9092:9092"
        ulimits:
          memlock:
            soft: -1
            hard: -1
        environment:
          KAFKA_BROKER_NAME: kafka01
          KAFKA_BROKER_ID: 1
          KAFKA_BROKER_PORT: 9092
          REPLICATION_FACTOR: 1
          ADVERTISED_LISTENER: PLAINTEXT://kafka01:9092
          ZOOKEEPER_NAME: zoo01
          KAFKA_ZOOKEEPER_CONNECT: "zoo01"
          KAFKA_CREATE_TOPICS: winlogbeat, filebeat
          KAFKA_HEAP_OPTS: -Xmx1G -Xms1G
          LOG_RETENTION_HOURS: 4
          KAFKA_LOG4J_LOGGERS: "kafka.controller=INFO,kafka.producer.async.DefaultEventHandler=INFO,state.change.logger=INFO"
        volumes:
          - /root/elk/kafka01/server.properties:/opt/elk/kafka/config/server.properties
        networks:
          - elk
        depends_on:
          - elasticsearch
          - ls01
          - zoo01
      kibana:
        image: kibana:7.5.2
        container_name: kibana
        ulimits:
          memlock:
            soft: -1
            hard: -1
        networks:
          - elk
        depends_on:
          - elasticsearch
          - ls01
          - zoo01
          - kafka01
      nginx:
        build: /root/elk/nginx
        container_name: nginx
        volumes:
          - /root/elk/nginx/default.conf:/etc/nginx/conf.d/default.conf
        ports:
          - "80:80"
        networks:
          - elk
        depends_on:
          - elasticsearch
          - ls01
          - zoo01
          - kafka01
          - kibana

    volumes:
      data08:
        driver: local

    networks:
      elk:
        driver: bridge
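
For completeness, the Winlogbeat side looks roughly like this. It is only a minimal sketch: the kafka01 host name and the winlogbeat topic match my setup above, the event_logs list is just an example, and the commented-out elasticsearch output is the variant that works fine when I bypass Kafka:

    winlogbeat.event_logs:
      - name: Application
      - name: Security
      - name: System

    # ship the events to the Kafka broker instead of Elasticsearch
    output.kafka:
      hosts: ["kafka01:9092"]
      topic: "winlogbeat"
      required_acks: 1
      compression: gzip

    # the direct output that works fine when I skip Kafka:
    # output.elasticsearch:
    #   hosts: ["elasticsearch:9200"]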

And here is the pipeline.conf that belongs to the Logstash container:

input {
  kafka {
    bootstrap_servers => "kafka01:9092"
    topics => ["winlogbeat","filebeat"]
    decorate_events => true
  }
}
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "winlogbeat-7.5.2"
    manage_template => true
    template => "/tmp/winlogbeat.template.json"
    template_overwrite => "true"
    codec => json
  }
}
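
For reference, the template was loaded with something along these lines. This is a rough reconstruction: the file name matches the one mounted into the Logstash container above, and the template name winlogbeat-7.5.2 is an assumption on my part:

    # export the template from Winlogbeat (run on the Windows machine)
    winlogbeat.exe export template --es.version 7.5.2 > winlogbeat.template.json

    # load it into Elasticsearch as a legacy index template
    curl -XPUT -H 'Content-Type: application/json' \
      'http://localhost:9200/_template/winlogbeat-7.5.2' \
      -d @winlogbeat.template.json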

And here is the screenshot from Kibana showing that the pipeline is working, but the data doesn't use the Winlogbeat template.
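
For anyone trying to reproduce this, the registered template and the actual index mapping can be compared with the standard endpoints (the template and index names here are assumptions based on my config above):

    # confirm the template is registered in Elasticsearch
    curl -s 'http://localhost:9200/_template/winlogbeat-7.5.2?pretty'

    # inspect the mapping the index actually ended up with
    curl -s 'http://localhost:9200/winlogbeat-7.5.2/_mapping?pretty'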

In case no one replies to this question: I more or less solved my problem by using Redis instead of Kafka.
It's working well in production so far. I needed Elasticsearch in cluster mode plus 5 Redis and 5 Logstash containers (a rough sketch of the Redis variant is below).
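
For anyone landing here with the same problem, the Redis variant I ended up with looks roughly like this. It is a minimal sketch of one shipper/consumer pair, not my full production setup: the redis01 host name and the winlogbeat key are placeholders, and the Elasticsearch output is the same as in the Kafka pipeline above:

    # winlogbeat.yml on the Windows hosts
    output.redis:
      hosts: ["redis01:6379"]
      key: "winlogbeat"

    # pipeline.conf on each Logstash container
    input {
      redis {
        host => "redis01"
        port => 6379
        data_type => "list"
        key => "winlogbeat"
      }
    }
    output {
      elasticsearch {
        hosts => ["elasticsearch:9200"]
        index => "winlogbeat-7.5.2"
        manage_template => true
        template => "/tmp/winlogbeat.template.json"
        template_overwrite => true
      }
    }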
