Create a separate index for each logfile

Hello everyone,

I have a Filebeat > Logstash > Elasticsearch setup in my environment, where we collect log data by installing Filebeat on the hosts and forwarding it to a Logstash beats input. However, we have a peculiar situation: three different apps write three different log files on the same server.
I have configured filebeat.yml to ship all three log files by listing them under filebeat.inputs.
My question is: is it possible to create a separate index for each of these logs in Elasticsearch?
I tried the settings below in the Logstash input pipeline, but it didn't work and I keep getting an error when I start Logstash.

Logstash input plugin:

input {
  beats {
    port => 5058
    path => "H:\LogFiles\ABC\abc.log"
    tags => ["ABC", "hostname"]
  }
  beats {
    port => 5058
    path => "H:\LogFiles\PQR\pqr.log"
    tags => ["PQR", "hostname"]
  }
  beats {
    port => 5058
    path => "H:\LogFiles\XYZ\xyz.log*"
    tags => ["XYZ", "hostname"]
  }
}
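
For what it's worth, after re-reading the beats input docs I suspect two problems with my attempt: path doesn't seem to be a valid option on the beats input (the file paths belong in Filebeat), and I doubt three inputs can all bind to port 5058. So I'm now wondering whether the right approach is a single beats input plus conditional elasticsearch outputs keyed on tags set in Filebeat (see the filebeat.yml sketch further down). Something like the following, which I haven't tested yet; the index names are just placeholders I made up:

input {
  beats {
    # one listener for all three Filebeat inputs
    port => 5058
  }
}

output {
  # route each event to its own index based on the tag set in filebeat.yml
  if "ABC" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "abc-logs-%{+YYYY.MM.dd}"
    }
  } else if "PQR" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "pqr-logs-%{+YYYY.MM.dd}"
    }
  } else if "XYZ" in [tags] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "xyz-logs-%{+YYYY.MM.dd}"
    }
  }
}

Does that look like the right direction?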

And here's the filebeat.inputs section of my filebeat.yml:

# ============================== Filebeat inputs ===============================

filebeat.inputs:
- type: filestream
  enabled: true
  id: abc
  paths:
    - H:\LogFiles\ABC\abc.log

- type: filestream
  enabled: true
  id: pqr
  paths:
    - H:\LogFiles\PQR\pqr.log

- type: filestream
  enabled: true
  id: xyz
  paths:
    - H:\LogFiles\XYZ\xyz.log

=======================================
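
And here's how I'm thinking of tagging each input in filebeat.yml so Logstash can route on the tag (again untested; the tag values are chosen to match the conditionals above):

filebeat.inputs:
- type: filestream
  enabled: true
  id: abc
  paths:
    - H:\LogFiles\ABC\abc.log
  tags: ["ABC"]   # arrives in Logstash as [tags]

- type: filestream
  enabled: true
  id: pqr
  paths:
    - H:\LogFiles\PQR\pqr.log
  tags: ["PQR"]

- type: filestream
  enabled: true
  id: xyz
  paths:
    - H:\LogFiles\XYZ\xyz.log
  tags: ["XYZ"]

If tagging each input like this is the wrong way to distinguish the three logs, is there a better field to route on (e.g. [log][file][path])?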

Thanks,
Nitesh