Multiple Log Paths With Multiple Indices

Many log files are stored in various locations on the system. I'm attempting to use the filestream input type to read the log files and send them straight to Elasticsearch. When I try to create an index for every log path, I see "Error loading template: error creating template instance: key not found" in Kibana. What's wrong with the configuration?
Filebeat version is 8.14.3


filebeat.inputs:
- type: filestream
  id: smx
  json.keys_under_root: true
  json.add_error_key: true
  fields:
    custom-index: smx
  scan_frequency: 30s
  harvester_limit: 100
  close.on_state_change.inactive: 30m
  close.on_state_change.removed: true
  clean_removed: true
  encoding: utf-8
  paths:
    - /home/app/smx/logs/app-info.log*
- type: filestream
  id: client
  json.keys_under_root: true
  json.add_error_key: true
  fields:
    custom-index: client
  scan_frequency: 30s
  harvester_limit: 100
  close.on_state_change.inactive: 30m
  close.on_state_change.removed: true
  clean_removed: true
  encoding: utf-8
  paths:
    - /home/app/client/log/app-info.log*
- type: filestream
  id: anotherapp
  json.keys_under_root: true
  json.add_error_key: true
  fields:
    custom-index: anotherapp
  scan_frequency: 30s
  harvester_limit: 100
  close.on_state_change.inactive: 30m
  close.on_state_change.removed: true
  clean_removed: true
  encoding: utf-8
  paths:
    - /home/app/anotherapp/log/app-info.log*



output.elasticsearch:
  hosts: ["http://myelastic"]
  username: "user"
  password: 'pass'
  index: "%{[fields.custom-index]}"

setup.template.enabled: true
setup.template.overwrite: true
setup.template.name: "%{[fields.custom-index]}"
setup.template.pattern: "%{[fields.custom-index]}-*"

setup.ilm.enabled: true
setup.ilm.policy_name: "7-days@lifecycle"
setup.ilm.rollover_alias: "%{[fields.custom-index]}"

This might be because the template setup phase runs before any events are processed. Since the filestream id appears to be equivalent to fields.custom-index, you might try referring to that instead.

Can you please show an example of how to refer to the filestream id?

There's nothing wrong with the fields.custom-index reference. I think the problem is in the template setup command and the availability of the key.

I believe the template setup command must run before any data is evaluated. Does your data differ significantly enough that you want to map the available fields differently? Do you really need to apply different templates?

If so, you may be able to refer to those templates statically and sequentially.
If not, you could create a pattern that applies to all indices, such as filestream-*
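
For the single-pattern option, a minimal sketch with a static name and pattern (assuming a filestream- prefix is acceptable) might look like this:

setup.template.enabled: true
setup.template.name: "filestream"
setup.template.pattern: "filestream-*"

Static values here sidestep the key not found error, because the setup phase has no event to resolve fields.custom-index from.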

Thanks for the answer. I need to apply different templates. I couldn't find any examples of referring to templates statically. Can you please point me to one?

A single index pattern works, but yes, an index should still be created for every filestream input type.

A different index can still be created with the output.elasticsearch index parameter; that can stay dynamic. I'm wondering whether you really need different templates for each index. The main reason to have different templates is if you need to map the same field names to different types. But the managed templates Elastic provides do a lot of the heavy lifting for you, and you have setup.template.enabled: true.
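
Putting those two ideas together, one possible sketch (the filestream- prefix in the index name is an assumption; host and credentials are the placeholders from the thread):

output.elasticsearch:
  hosts: ["http://myelastic"]
  username: "user"
  password: 'pass'
  index: "filestream-%{[fields.custom-index]}-%{+yyyy.MM.dd}"

# A custom index requires a matching static template name and pattern,
# and the Filebeat docs note the index setting is ignored while ILM is enabled.
setup.template.enabled: true
setup.template.name: "filestream"
setup.template.pattern: "filestream-*"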

Managed templates work well for us too. I modified the lines, but no luck.

output.elasticsearch:
  hosts: ["https://myelastic"]
  username: "user"
  password: 'pass'
  index: "%{[inputs.id]}-%{yyyy.MM.dd}"

setup.template.enabled: false
setup.ilm.enabled: true
setup.ilm.policy_name: "7-days@lifecycle"

Could you do this?

index: "logs-%{[fields.custom-index]}-%{yyyy.MM.dd}"

An index name starting with logs- will use the existing stack-managed index template.

In the output, the fields.custom-index value will be available.
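
If a single format string keeps failing, Filebeat's output.elasticsearch.indices array is a documented alternative that routes events to per-input indices with conditions; a sketch for two of the inputs above (the index names are assumptions):

output.elasticsearch:
  hosts: ["http://myelastic"]
  indices:
    - index: "logs-smx-%{+yyyy.MM.dd}"
      when.equals:
        fields.custom-index: "smx"
    - index: "logs-client-%{+yyyy.MM.dd}"
      when.equals:
        fields.custom-index: "client"

Each entry is evaluated in order, and the first matching when condition wins.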

Unfortunately, I ran into another error. I apologize for misdirecting you: the custom fields are not actually necessary for the input types; I only added them for dynamic index naming. In my situation, fields: custom-index: is entirely unnecessary. I couldn't locate any examples of using the input type's id value. I simply need different index names for different log files, with a lifecycle policy of course.

 Exiting: error initializing publisher: unsupported format expression "yyyy.MM.dd" in index 

There must be a plus sign before the date format:

index: "logs-%{[fields.custom-index]}-%{+yyyy.MM.dd}"

But after this change, the error changed to:

"failed to publish events: temporary bulk send failure","service.name":"filebeat"
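
The temporary bulk send failure message hides the per-event rejection reasons Elasticsearch returned. One way to surface them (a debugging sketch using Filebeat's documented logging settings) is to raise the verbosity of the output selector and re-run:

logging.level: debug
logging.selectors: ["elasticsearch"]

The debug log should then include the individual bulk item errors, which usually point at the actual cause (a mapping conflict, a missing index permission, data-stream restrictions on the logs- prefix, and so on).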

Pffff, too many frustrations for one configuration.