How to get kafka metadata from a confluent kafka topic

Hi,

I am reading data from a Confluent Kafka topic using Logstash, and I am trying to get the complete Kafka metadata, including the headers, into my data. I have tried the config below, but I am still unable to see the metadata in the output. Could you please help here?

I have checked the payload from kafka and this is my sample payload:

{"partition_id":"1","key":{"type":"STRING","data":"xxxxxxxxxxxxxxx"},"headers":[{"name":"original_filename","value":"xxxxxxxxxxxxxx"},{"name":"project","value":"xxxxxx"},{"name":"app_source","value":"xxxxxxx"}],"value":{"type":"JSON","data":{"documents":[{}]}}}

Config:

input {
  kafka {
    bootstrap_servers => <kafka_server_list_redacted>
    topics => ["test_topic"]
    group_id => "xxxxxxxxxxxxxx"
    security_protocol => "SASL_SSL"
    sasl_mechanism => "PLAIN"
    sasl_jaas_config => "xxxxxxxxxxxxxxxxxx"
    codec => json {}
    decorate_events => extended
  }
}
filter {
}
output {
  elasticsearch {
    hosts => <elasticsearch_hosts_redacted>
    index => "%{[@metadata][kafka][lc_topic]}-%{+YYYY.MM.dd}"
  }
}

When I run this config I only get the data present in "value":{"type":"JSON","data":{"documents":[{}]}}, but not the other fields I want.

I cannot test it, but my understanding from the code is that if you set decorate_events to extended then everything gets set in the [@metadata][kafka] field. Almost universally, outputs don’t do anything with [@metadata] and the elasticsearch output will not send it to elasticsearch. If you want to get that data into elasticsearch you could try

mutate { rename => { "[@metadata][kafka]" => "[kafka_metadata]" } }
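For context, here is what that rename might look like inside a full filter block. The field names listed in the comment (topic, partition, offset, key, timestamp, and headers in extended mode) are what I would expect the kafka input to populate under [@metadata][kafka]; treat them as an assumption and verify against your own events with a stdout/rubydebug output:

```
filter {
  mutate {
    # [@metadata] is dropped by outputs, so copy the Kafka metadata into a
    # regular field before it reaches elasticsearch. With
    # decorate_events => extended this object should contain fields such as
    # topic, partition, offset, key, timestamp, and the record headers
    # (assumed names -- confirm against your events).
    rename => { "[@metadata][kafka]" => "[kafka_metadata]" }
  }
}
```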

Thank you so much @Badger. It worked, and now I am able to see the metadata in the "kafka_metadata" object.