Hi,
I am reading data from a Confluent Kafka topic with Logstash and trying to include the complete Kafka metadata (key, partition, headers, etc.) in my events. I have tried the config below, but the metadata still does not appear in the output. Could you please help?
I have checked the payload in Kafka; this is a sample:
```json
{
  "partition_id": "1",
  "key": { "type": "STRING", "data": "xxxxxxxxxxxxxxx" },
  "headers": [
    { "name": "original_filename", "value": "xxxxxxxxxxxxxx" },
    { "name": "project", "value": "xxxxxx" },
    { "name": "app_source", "value": "xxxxxxx" }
  ],
  "value": { "type": "JSON", "data": { "documents": [ {} ] } }
}
```
Config (note: the kafka input decorates the topic name as `[@metadata][kafka][topic]`, so I use that in the index name):

```
input {
  kafka {
    bootstrap_servers => <kafka_server_list_redacted>
    topics            => ["test_topic"]
    group_id          => "xxxxxxxxxxxxxx"
    security_protocol => "SASL_SSL"
    sasl_mechanism    => "PLAIN"
    sasl_jaas_config  => "xxxxxxxxxxxxxxxxxx"
    codec             => json {}
    decorate_events   => extended
  }
}
filter {
}
output {
  elasticsearch {
    hosts => <elasticsearch_hosts_redacted>
    index => "%{[@metadata][kafka][topic]}-%{+YYYY.MM.dd}"
  }
}
```
When I run this config, the events contain only the data inside `"value": {"type": "JSON", "data": {"documents": [{}]}}`, but none of the other fields (key, headers, partition) that I want.
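My understanding from the kafka input plugin docs is that `decorate_events` stores the record metadata under `[@metadata][kafka]`, and `@metadata` fields are deliberately never written by outputs, so they would have to be copied into the event explicitly. Is a filter like the following the missing piece? (Field names are taken from the plugin docs; the `[kafka]` target field is just a name I picked.)

```
filter {
  mutate {
    # [@metadata] is dropped by outputs, so copy the decorated
    # Kafka metadata into regular event fields first.
    add_field => {
      "[kafka][topic]"     => "%{[@metadata][kafka][topic]}"
      "[kafka][partition]" => "%{[@metadata][kafka][partition]}"
      "[kafka][offset]"    => "%{[@metadata][kafka][offset]}"
      "[kafka][key]"       => "%{[@metadata][kafka][key]}"
      "[kafka][timestamp]" => "%{[@metadata][kafka][timestamp]}"
    }
  }
}
```

With `decorate_events => extended` I believe the record headers are also decorated onto the event, but I am not sure of their exact field path, so I have left them out of this sketch.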