Target index by input

I'm trying to set up Filebeat with two log sources that should end up in different Elasticsearch indices behind the target Logstash. All involved services (Filebeat, Logstash, Elasticsearch) are version 8.12.0.

I've found this question with the solution "you can set the index per input" (as documented here).

However, with my input section set up like this it doesn't work; the data seems to get lost altogether. My Filebeat config:

filebeat.inputs:
  - type: filestream
    id: "filestream-id"
    enabled: true
    index: "index-id"
    paths:
      - /usr/share/filebeat/logs/*.log

fields_under_root: true

processors:
  - dissect:
      tokenizer: "/usr/share/filebeat/logs/%{service_name}.log"
      field: "log.file.path"
      target_prefix: ""

output.logstash:
  hosts: [ "loghost:5044" ]

However, the configured index-id index never appears on the ELK host. This is the config of the target Logstash:

input {
  beats {
    port => 5044
    codec => "json"
  }
}

filter {
  mutate {
    remove_field => ["[event][original]"]
  }
}

output {
  elasticsearch {
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    hosts => "${ELASTIC_HOSTS}"
    user => "${ELASTIC_USER}"
    password => "${ELASTIC_PASSWORD}"
    cacert => "certs/ca/ca.crt"
  }
  stdout {
    codec => rubydebug
  }
}

Everything works fine if I move the index: ... entry from the filebeat.inputs section to output.logstash, but eventually I want to add more inputs on that host that should end up in a different index.
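For illustration, the kind of setup I'm aiming for would add a second input with its own index (the second id, index name, and path below are hypothetical):

```yaml
filebeat.inputs:
  - type: filestream
    id: "filestream-id"
    enabled: true
    index: "index-id"
    paths:
      - /usr/share/filebeat/logs/*.log
  - type: filestream
    id: "other-filestream-id"    # hypothetical second input
    enabled: true
    index: "other-index-id"      # hypothetical second index
    paths:
      - /var/log/other/*.log     # hypothetical path
```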

Am I doing something wrong?

Try this

The per-input index option only overrides the index directly when Filebeat writes to an Elasticsearch output. For any other output, including Logstash, the Filebeat docs say it "sets the raw_index field of the event's metadata" instead, so reference that in your elasticsearch output:

index => "%{[@metadata][raw_index]}-%{+YYYY.MM.dd}"
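If some inputs on the host won't set their own index, a conditional keeps those events flowing into the default beat-based index. A sketch against the question's output block, assuming the raw_index metadata field is what Filebeat sets for non-Elasticsearch outputs:

```
output {
  # Events from inputs that declared an index carry it in @metadata.raw_index
  if [@metadata][raw_index] {
    elasticsearch {
      index => "%{[@metadata][raw_index]}-%{+YYYY.MM.dd}"
      hosts => "${ELASTIC_HOSTS}"
      user => "${ELASTIC_USER}"
      password => "${ELASTIC_PASSWORD}"
      cacert => "certs/ca/ca.crt"
    }
  } else {
    # Fallback for inputs without an index of their own
    elasticsearch {
      index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
      hosts => "${ELASTIC_HOSTS}"
      user => "${ELASTIC_USER}"
      password => "${ELASTIC_PASSWORD}"
      cacert => "certs/ca/ca.crt"
    }
  }
}
```

To confirm what metadata actually arrives, note that rubydebug hides @metadata by default; `stdout { codec => rubydebug { metadata => true } }` prints it.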