Logstash - how to configure a CSV filter to join 2 CSV files on a common field (one-to-many mapping) and ingest the data into a single index

You will need to reformat the csv1 file, but once you have done that, you can do the join using a translate filter.
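This assumes csv1 is the lookup file, along the lines of (the exact column names and values here are inferred from the filters below)

    faculty_id,faculty_name,reporting_manager
    1,AAA,R1
    2,BBB,R2

and csv2 holds the one-to-many data rows, each carrying a faculty_id column.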

Use a file input to read csv2, then apply the following filters:

    csv { autodetect_column_names => true }
    translate {
        # csv1 reformatted to faculty_id,faculty_name pairs
        dictionary_path => "/home/user/t.test/csv1.csv"
        field => "faculty_id"
        destination => "faculty_name"
    }

Then use a second translate filter whose dictionary file looks like

faculty_id,reporting_manager
1,R1
2,R2
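That second filter would look much like the first, just with a different dictionary and destination (the file name here is hypothetical):

    translate {
        # faculty_id,reporting_manager pairs -- hypothetical file name
        dictionary_path => "/home/user/t.test/csv1_managers.csv"
        field => "faculty_id"
        destination => "reporting_manager"
    }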

Alternatively, you could completely change the format of csv1 so that each key maps to a JSON object containing all of the joined fields:

faculty_id,JSON
1,"{""faculty_name"": ""AAA"", ""reporting_manager"": ""R1""}"
2,"{""faculty_name"": ""BBB"", ""reporting_manager"": ""R2""}"

then use these filters:

    csv { autodetect_column_names => true }
    translate {
        dictionary_path => "/home/user/csv1.csv"
        field => "faculty_id"
        # stash the JSON string in @metadata so it is not indexed
        destination => "[@metadata][json]"
    }
    # parse the JSON, adding faculty_name and reporting_manager to the event
    json { source => "[@metadata][json]" }
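With either approach, an event read from csv2 with faculty_id 1 should end up with both looked-up fields added alongside whatever other columns csv2 contains, something like

    {
        "faculty_id" => "1",
        "faculty_name" => "AAA",
        "reporting_manager" => "R1",
        ...
    }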