Creating dynamic index name with Filebeat based on custom event field fails

I got this working after developing a better understanding of how fields work in Filebeat and Elasticsearch. In my case, the custom field I was referring to is created in the Elasticsearch ingest pipeline, so it does not yet exist at the point where Filebeat resolves the index name. Instead, I used a simple dissect processor in filebeat.yml to parse the log message and extract the field I was interested in. That field is then available for reference in filebeat.yml, so I used it in the index name. Note that I still use the Elasticsearch ingest pipeline for the detailed processing. Here is a snippet of my filebeat.yml:

filebeat.inputs:
- type: filestream
  id: syslog_filebeat-id
  enabled: true
  prospector.scanner.check_interval: 1s
  paths:
    - /srv/decompress_dir/*/log/syslog
    - /srv/decompress_dir/*/log/syslog.[0-9]
    - /srv/decompress_dir/*/log/syslog.[1-9][0-9]
    - /srv/decompress_dir/*/log/syslog.[1-9][0-9][0-9]

  processors:
    - dissect:
        tokenizer: "%{key1};;;%{key2},%{key3},%{key4},%{ticket_num}"
        field: "message"
        target_prefix: "dissect"

output.elasticsearch:
  hosts: ["1.1.1.1:9200"]
  index: "syslog-case-num-%{[dissect][ticket_num]}"
  pipeline: my_syslog_pipeline

# ================ Elasticsearch template setting ==================
setup.template.settings:
  index.number_of_shards: 1
setup.template.name: "my_syslog"
setup.template.pattern: "syslog-case-*"
setup.ilm.enabled: false

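If you want to sanity-check what the dissect tokenizer will extract before shipping logs, here is a rough Python sketch of its behavior. This is a simulation, not Filebeat code, and the sample log line is invented to fit the tokenizer above:

```python
import re

def dissect(message, tokenizer):
    """Rough simulation of Filebeat's dissect processor: %{name}
    markers become named capture groups, everything else is literal."""
    parts = re.split(r"%\{(\w+)\}", tokenizer)
    pattern = "".join(
        re.escape(p) if i % 2 == 0 else f"(?P<{p}>.*?)"
        for i, p in enumerate(parts)
    )
    m = re.fullmatch(pattern, message)
    return m.groupdict() if m else {}

# Invented sample line matching the tokenizer from filebeat.yml above.
tokenizer = "%{key1};;;%{key2},%{key3},%{key4},%{ticket_num}"
message = "hostA;;;2024-06-01T12:00:00,appX,notice,12345"

fields = dissect(message, tokenizer)
print(fields["ticket_num"])                       # 12345
# Filebeat places these under the "dissect" target_prefix, so the
# index setting would expand to:
print(f"syslog-case-num-{fields['ticket_num']}")  # syslog-case-num-12345
```

One thing to watch: Elasticsearch index names must be lowercase, so make sure the extracted field value is lowercase as well, or index creation will fail.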