How to join two or more CSV files into one Elasticsearch index using Logstash

I have created two CSV files, A.csv and B.csv.
A.csv has the fields id, name, city, etc. (10 records).
B.csv has the fields id, age, blood_group, etc. (10 records).
The id field is common to both files. Can I create an index in Elasticsearch from these two CSV files, so that the index contains only the unique, merged records from both?

I am using Logstash to insert the data into Elasticsearch.

Please suggest a way to implement this.

Here is my config file:

input {
  file {
    # The file input requires forward slashes in paths, even on Windows.
    path => "D:/ELK/logstash-6.1.1/bin/input/employee.csv"
    start_position => "beginning"
  }
  file {
    path => "D:/ELK/logstash-6.1.1/bin/input/dept.csv"
    start_position => "beginning"
  }
  file {
    path => "D:/ELK/logstash-6.1.1/bin/input/employee_dept.csv"
    start_position => "beginning"
  }
}
filter {
  if [path] == "D:/ELK/logstash-6.1.1/bin/input/employee.csv" {
    csv {
      separator => ","
      columns   => ["emp_id","employee_name","designation","department_id","salary"]
    }
  }
  else if [path] == "D:/ELK/logstash-6.1.1/bin/input/dept.csv" or [path] == "D:/ELK/logstash-6.1.1/bin/input/employee_dept.csv" {
    csv {
      separator => ","
      # Column names must be unique; "department_id" was listed twice.
      columns   => ["emp_id","employee_name","designation","department_id","salary","department_name"]
    }
  }
}
output {
  elasticsearch {
    hosts    => "localhost:9200"
    index    => "test"
    user     => "elastic"
    password => "43y2*k1TOqwFUIjYBMb!"
  }
  stdout { codec => rubydebug }
}
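One way to merge rows that share an id into a single document (a sketch, not tested against your data; it assumes the csv filters put the shared key into a field named emp_id): use that key as the Elasticsearch document _id and let the output upsert, so fields from each file are merged into the same document instead of indexed as separate ones.

```
output {
  elasticsearch {
    hosts         => "localhost:9200"
    index         => "test"
    # Rows with the same emp_id map to the same document _id.
    document_id   => "%{emp_id}"
    # Update-with-upsert: create the document if it does not
    # exist yet, otherwise merge the new fields into it.
    action        => "update"
    doc_as_upsert => true
    user          => "elastic"
    password      => "43y2*k1TOqwFUIjYBMb!"
  }
}
```

As a side effect, repeated rows with the same id overwrite each other rather than creating duplicates, which also gives you only unique records in the index.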
