How to pull data from Kafka to Elasticsearch

Hi everyone,
I want to send data from Beats to Kafka, and then from Kafka on to Logstash. Kafka's role here is a message queue. I would like to use two Beats, but the expected indices are not being created in Elasticsearch.

Logstash

input {
  kafka {
    bootstrap_servers => "172.28.26.169:9092"
    topics => ["metricbeat"]
    codec => "json"
  }
}

filter {
  if [@metadata][kafka][topic] == "metricbeat" {
    mutate {
      add_field => { "[es_index]" => "metricbeat2-%{+YYYY.MM.dd}"}
    }
  } else if [@metadata][kafka][topic] == "filebeat" {
    mutate {
      add_field => { "[es_index]" => "filebeat"}
    }
  }
}

output {
  elasticsearch {
    hosts => ["172.28.26.169:9200"]
    index => "%{es_index}"
    workers => 1
  }
}

Hi @khergner,

I do not use the Kafka input plugin for Logstash myself, but as far as I can tell that config looks fine. What problems are you running into, or are you asking for advice before implementing it?

You need to set decorate_events => true on the kafka input. Without it, Logstash never populates [@metadata][kafka][topic], so neither of your mutate branches matches, es_index is never added, and the output writes to a literal "%{es_index}" index.
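As a sketch, the input block from your config could be extended like this (broker address and topic names taken from your post; subscribing to both topics in one input is an assumption based on your two-Beat setup):

input {
  kafka {
    bootstrap_servers => "172.28.26.169:9092"
    topics => ["metricbeat", "filebeat"]
    codec => "json"
    # Adds [@metadata][kafka][topic] (and other Kafka metadata) to each event,
    # which your filter conditionals depend on.
    decorate_events => true
  }
}

One more thing to consider: because add_field creates es_index as a regular field, it will also be stored in every indexed document. Writing it to [@metadata][es_index] instead (and referencing %{[@metadata][es_index]} in the output) keeps the routing field out of the documents themselves.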
