Consuming multiple Kafka topics with Logstash and setting a per-topic index in the output of a `.conf` file

I am using Logstash 8.11, and I have the problem described in the subject title.

I have looked at solutions in a few other topics, but I still have not solved the problem.
For example, the old topic "How to pull data from 2 kafka topics using logstash and index the data in two separate index in elasticsearch" covers the same case and was resolved by guyboertje, but that solution does not work now.

There are a few other related topics, but they use Filebeat. That is not my setup; I am configuring everything through a ".conf" file.

This is the content of my ".conf" file:

input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["a_system_logs", "b_system_logs"]
  }
}
filter {
  mutate {
    add_field => { "es_index" => "logstash_%{[@metadata][kafka][topic]}" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{es_index}"
  }
}

It is not working. When the service sends data, Logstash shows:

[2024-01-31T00:32:36,736][WARN ][logstash.outputs.elasticsearch][main][...] Badly formatted index, after interpolation still contains placeholder: [logstash_%{[@metadata][kafka][topic]}]; event: `{"@timestamp"=>2024-01-30T17:32:36.569868Z, "@version"=>"1", "es_index"=>"logstash_%{[@metadata][kafka][topic]}", "message"=>"***my data***"

Is there anyone using a ".conf" file who can show me how to solve this problem?

Metadata is not added to events by default. You need to set the decorate_events option on the kafka input.
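
For reference, here is a minimal sketch of a corrected pipeline, assuming the same localhost addresses, topic names, and "logstash_" index prefix from your config. In recent versions of the kafka input plugin, decorate_events takes the string values "none", "basic", or "extended":

input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["a_system_logs", "b_system_logs"]
    # "basic" adds [@metadata][kafka][topic], partition, offset, key, etc.
    # (older plugin versions used a boolean here; true still works and behaves like "basic")
    decorate_events => "basic"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # @metadata fields can be referenced directly in the output,
    # so the mutate filter that copies the topic into es_index is optional
    index => "logstash_%{[@metadata][kafka][topic]}"
  }
}

If you prefer to keep the es_index field from your original config, the same decorate_events setting is what makes the %{[@metadata][kafka][topic]} interpolation in the mutate filter work.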

I can't believe such a small configuration step was missing.
I looked at some other topics, and they didn't even mention this step, so I didn't pay attention to it.
Thanks so much.
