Multiple indexes not being created for multiple log files

Hello,

I've been trying to create multiple indexes for multiple log files that get created on a server. I'm quite new to Filebeat, so I wanted to know what I might be doing wrong in the configuration. The use case requires me to use Filebeat to capture logs and send them to Elasticsearch.

Can someone please help? I'm currently testing the configuration below with Filebeat 8.x.

# ============================== Filebeat inputs ===============================
filebeat.inputs:
- type: filestream
  id: test-index1-logs
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - {{ log_path_index1 }}
  fields:
    name: "index1"

- type: filestream
  id: test-index2-logs
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - {{ log_path_index2 }}
  fields:
    name: "index2"

# ======================= Elasticsearch template setting =======================
setup.template.enabled: true
setup.template.name: "test-%{[fields.name]}"
setup.template.pattern: "test-%{[fields.name]}-*"
setup.template.settings:
  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false

# ---------------------------- Elasticsearch Output ----------------------------
output.elasticsearch:
  enabled: true
  allow_older_versions: true
  # Array of hosts to connect to.
  #hosts: ["localhost:9200"]
  index: "test-%{[fields.name]}-%{+yyyy.MM.dd}"

Can someone please help with this?

Hi @developer_cloud, welcome to the Elastic community. The configuration file looks good to me. Are you getting any errors?

You can try the debugging steps and check whether you are getting any errors, for example with the commands sketched below.
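
A minimal sketch of how you might check the setup, assuming the configuration lives at the default /etc/filebeat/filebeat.yml (adjust the -c path for your install):

# Validate the configuration file syntax
filebeat test config -c /etc/filebeat/filebeat.yml

# Verify that Filebeat can reach the configured Elasticsearch output
filebeat test output -c /etc/filebeat/filebeat.yml

# Run Filebeat in the foreground with debug logging to watch events and errors
filebeat -e -d "*" -c /etc/filebeat/filebeat.yml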

Please paste any errors or warnings you are getting.