Retrofitting Filebeat into my ELK stack

I am ingesting 8 different CSV file schemas using 8 Logstash pipelines, reading from a Windows file share. IT has recommended not reading across the file share and using Filebeat instead. I'm not finding much info in this area.
Here are 2 of the inputs from my filebeat.yml:

--------------------------------------------------------------

MemoryLeak input:

- type: log
  enabled: false
  paths:
    - C:\MemoryLeakTest\v03*-memleak*.log
  tags: ["memleak"]
  fields: {log_type: memleak}

--------------------------------------------------------------

AM-300 TaskStat input:

- type: log
  enabled: false
  paths:
    - C:\MemoryLeakTest\v03*_am-300-taskstat*.log
  tags: ["taskstat-am-300"]
  fields: {log_type: taskstat-am-300}

The output is:

output.logstash:
  # The Logstash hosts
  hosts: [":5044"]

Now, how the heck do I get this to work with my 8 pipeline files?

I put this at the top of one of the pipeline files:

input {
  beats {
    host => "0.0.0.0"
    port => 5044
    client_inactivity_timeout => 180
  }
}

My question is: does this look right so far?
And then, how do I split the data across the 8 filters in the 8 pipeline files?

Thanks, Mike

You could use pipeline-to-pipeline communication.

You would be using the "distributor" pattern: one pipeline owns the beats input, and the conditional logic in its output section references the [fields][log_type] value you set in Filebeat to route each event to the matching downstream pipeline.
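For reference, here is a minimal sketch of how that could look, assuming Logstash 6.5+ (when pipeline-to-pipeline communication was added). The pipeline ids, addresses, and paths below (distributor, memleak, taskstat-am-300, the C:/logstash/config/ paths) are invented names for illustration; adjust them to your layout.

First, pipelines.yml declares the distributor plus one pipeline per schema. Only the distributor binds port 5044:

# pipelines.yml (sketch; ids and paths are placeholders)
- pipeline.id: distributor
  path.config: "C:/logstash/config/distributor.conf"
- pipeline.id: memleak
  path.config: "C:/logstash/config/memleak.conf"
- pipeline.id: taskstat-am-300
  path.config: "C:/logstash/config/taskstat-am-300.conf"
# ...one entry for each of the remaining 6 schemas

The distributor holds the beats input you already wrote and routes on [fields][log_type]:

# distributor.conf (sketch)
input {
  beats {
    host => "0.0.0.0"
    port => 5044
    client_inactivity_timeout => 180
  }
}
output {
  if [fields][log_type] == "memleak" {
    pipeline { send_to => ["memleak"] }
  } else if [fields][log_type] == "taskstat-am-300" {
    pipeline { send_to => ["taskstat-am-300"] }
  }
  # ...one branch per remaining log_type
}

Each of the 8 pipeline files then swaps its old input for a pipeline input whose address matches the send_to value; the existing filter and output sections stay as they are:

# memleak.conf (sketch)
input {
  pipeline { address => "memleak" }
}
filter {
  # existing memleak CSV filter, unchanged
}
output {
  # existing output, unchanged
}

One detail worth noting: send_to and address are virtual addresses that must match each other, but they do not have to equal the pipeline.id.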