Create a new index for each random directory

Hi team, I have a requirement to create separate indices for randomly generated directories created by scripts, e.g. /mnt/data/logger/xyz/xyz-1.1.1.1 and /mnt/data/logger/xyz/xyz-1.2.1.1, where the xyz-x.x.x.x directory names are generated programmatically.

How can I create separate indices for each random directory? I'm using Filebeat to collect the logs and Logstash for filtering and indexing.

Expectation: xyz-1.1.1.1-{date} and xyz-1.2.1.1-{date}

filebeat.yml:

```
- type: log
  enabled: true
  paths:
    - /mnt/data/logger/xyz/*/*.log
  fields:
    log_type: "xyz"
```

logstash:

```
else if [fields][log_type] == "xyz" {
  mutate {
    add_field => { "index_name" => "xyz_logs" }
  }
}
```

That's simple enough. Parse [log][file][path] using grok to extract the directory name (e.g. see this thread).
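Roughly, a minimal sketch of that parse (untested; it assumes Filebeat ships the path in [log][file][path], and [@metadata][dest] is just an arbitrary name for the captured directory):

```
filter {
  grok {
    # Capture the name of the directory containing the log file (e.g. xyz-1.1.1.1) into @metadata
    match => { "[log][file][path]" => "%{GREEDYDATA}/%{DATA:[@metadata][dest]}/%{GREEDYDATA}" }
  }
}
```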

However, creating an index per day per directory can be a performance problem in Elasticsearch, because it can produce a large number of small indices/shards. (An index is stored in one or more shards.)

The documentation recommends using multi-gigabyte shards.

Could you please provide a snippet, or update the above snippet, for reference?
filebeat:

```
- type: log
  enabled: true
  paths:
    - /mnt/data/logger/xyz/*/*.log
  fields:
    log_type: "xyz"
```

logstash:

```
else if [fields][log_type] == "xyz" {
  grok {
    # Extract the directory name (e.g. xyz-1.1.1.1) from the file path into @metadata
    match => { "[log][file][path]" => "%{GREEDYDATA:dir}/%{DATA:[@metadata][dest]}/%{GREEDYDATA}" }
  }
  mutate {
    # Build the per-directory index name, e.g. xyz_logs-xyz-1.1.1.1
    add_field => { "index_name" => "xyz_logs-%{[@metadata][dest]}" }
  }
}
```
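To end up with names like xyz-1.1.1.1-{date}, the date suffix still has to be added where index_name is consumed, typically in the elasticsearch output. A minimal sketch, assuming a daily suffix (the hosts value is a placeholder and the date format is just one option):

```
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    # sprintf the per-directory index name and append a daily suffix,
    # e.g. xyz_logs-xyz-1.1.1.1-2024.05.01
    index => "%{index_name}-%{+YYYY.MM.dd}"
  }
}
```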

It is not working with the above snippet.

Thanks, it worked.