Multiple inputs to multiple indices


I'm new to the Elastic stack but have been trying to learn it over the last couple of weeks. I have Elasticsearch and Kibana running (I can at least use the DevTools), and I'm using Logstash to retrieve data from an MSSQL database. Right now I'm working with sample data (the goal is to implement this with a similar database at work). My config file currently looks like this:

input {
  jdbc {
    jdbc_driver_library => "C:\Program Files\Microsoft JDBC Driver 7.4 for SQL Server\sqljdbc_7.4\enu\mssql-jdbc-7.4.1.jre8.jar"
    jdbc_driver_class => ""
    jdbc_connection_string => "jdbc:sqlserver://localhost:1433;databaseName=AdventureWorks2019"
    jdbc_user => "test_user"
    jdbc_password => "pw1"
    statement => "SELECT * FROM Person.Person WHERE modifieddate > :sql_last_value"
    schedule => "* * * * *"
  }
}

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "person_person"
        action => "update"
        document_id => "%{businessentityid}"
        user => "elastic"
        password => "OI4OZjzp7JjPuv4yIGwE"
    }
}

With this configuration I get the table Person.Person into Elasticsearch (I can access it in DevTools), and every minute all rows that have been updated in MSSQL since the last check (i.e. whose modifieddate has changed) get updated in Elasticsearch. So far so good.

The problem now is that I have at least 20 different tables that I want to index. Maybe an index per table isn't the best solution, but it's what makes the most sense to me right now. Is there a way to modify my config file to update all these indices at the same time, or do I have to have multiple config files "running" (both to retrieve the data the first time and to keep it updated)?

I've found this solution: multiple-inputs-on-logstash-jdbc. So I understand that I can have many inputs, but I would like to output to different indices too.
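Based on that, something like the sketch below is what I'm imagining: one `jdbc` input per table, each tagged with a `type`, and conditionals in the output to route each table to its own index. The second table and its `document_id` column are just examples I made up for illustration, and I've left out the credentials that are the same as above:

```
input {
  jdbc {
    # same driver/connection/user settings as above for each input...
    statement => "SELECT * FROM Person.Person WHERE modifieddate > :sql_last_value"
    schedule => "* * * * *"
    type => "person_person"
  }
  jdbc {
    # hypothetical second table
    statement => "SELECT * FROM Sales.SalesOrderHeader WHERE modifieddate > :sql_last_value"
    schedule => "* * * * *"
    type => "sales_salesorderheader"
  }
}

output {
  if [type] == "person_person" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "person_person"
      action => "update"
      document_id => "%{businessentityid}"
    }
  } else if [type] == "sales_salesorderheader" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "sales_salesorderheader"
      action => "update"
      document_id => "%{salesorderid}"   # each table would need its own id column
    }
  }
}
```

Is this roughly the right approach, or does it fall apart with 20+ tables?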

Also, I'm not defining any mapping, which I guess happens automatically (dynamic mapping), but I've understood that defining one explicitly is a smart thing to do. Is that something I can do in this config file, or do I need to set it up in Elasticsearch before I import the data?
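For example, would creating the index with an explicit mapping in DevTools before the first Logstash run be the right way? Something like this is my guess (the field types are assumptions on my part; I noticed Logstash lowercases the column names):

```
PUT /person_person
{
  "mappings": {
    "properties": {
      "businessentityid": { "type": "integer" },
      "firstname":        { "type": "text" },
      "lastname":         { "type": "text" },
      "modifieddate":     { "type": "date" }
    }
  }
}
```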

I'm very happy for any guidance I could get, thank you!