How to combine text file results?

I have some scheduled batch jobs that run automatically during the night and write their results to separate .txt files. Is there any way to grab the data from the .txt files and consolidate it for display in Kibana? Any tools or reference .conf?

Here is the case (Windows):

Job Code.txt

job_id=0001,description=Ship data from server to elknode1
job_id=0002,description=Ship data from server to elknode2
job_id=0003,description=Ship data from server to elknode3
job_id=0004,description=Ship data from server to elknode4

Job Status.txt

job_id=0001,result=OK
job_id=0002,result=Error: Msg...
job_id=0003,result=OK
job_id=0004,result=OK

Here is my very basic logstash.conf file; this is as far as I know what to write:

input {
  file {
    path => "C:/Job/*.txt"
    start_position => "beginning"
  }
}
filter {}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
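Since every line in both files is already in key=value form, one way to start filling in the empty filter block (a sketch, not tested against the real files; field names are taken from the sample lines above) is to let the kv filter split the fields instead of writing grok patterns by hand:

```conf
filter {
  # Split "job_id=0001,description=..." into job_id and description
  # (or into job_id and result, for the status file)
  kv {
    field_split => ","
    value_split => "="
  }
}
```

This produces separate `job_id`, `description` and `result` fields on each event, which Kibana can then use directly.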

How can I combine the two into one?

Thanks

I think the collate filter is what comes closest to what you need.

I don't really know how to write the filter part. It keeps giving results like this.

This is the filter part of my conf file. How can I separate the three fields so they show up in Kibana?

filter {
  grok {
    # Match either file's line format:
    #   job_id=0001,description=Ship data from server to elknode1
    #   job_id=0001,result=OK
    # Grok creates the job_id, description and result fields directly,
    # so no extra add_field copies are needed.
    match => {
      "message" => [
        "job_id=%{INT:job_id},description=%{GREEDYDATA:description}",
        "job_id=%{INT:job_id},result=%{GREEDYDATA:result}"
      ]
    }
  }
  if [job_id] == "0001" {
    aggregate {
      task_id => "%{job_id}"
      code => "map['time_elapsed'] = 0"
      map_action => "create"
    }
  }
  if [job_id] == "0002" {
    aggregate {
      task_id => "%{job_id}"
      code => "map['time_elapsed'] = 0"
      map_action => "update"
    }
  }
  if [job_id] == "0003" {
    aggregate {
      task_id => "%{job_id}"
      code => "map['time_elapsed'] = 0"
      map_action => "update"
    }
  }
  if [job_id] == "0004" {
    aggregate {
      task_id => "%{job_id}"
      code => "map['time_elapsed'] = 0"
      map_action => "update"
      end_of_task => true
      timeout => 120
    }
  }
}
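For the original question of combining the two files into one event per job, a hedged alternative sketch (untested; field names taken from the sample files) is to key the aggregate on `job_id` itself rather than hard-coding each ID: collect whichever field arrives first in the map, then copy the stored values onto the later event. Note that the aggregate filter only behaves correctly with a single pipeline worker (`-w 1` / `pipeline.workers: 1`):

```conf
filter {
  # Assumes job_id, description and result have already been parsed
  aggregate {
    task_id => "%{job_id}"
    code => "
      map['description'] ||= event.get('description')
      map['result']      ||= event.get('result')
      event.set('description', map['description']) if map['description']
      event.set('result', map['result'])           if map['result']
    "
    # Drop the stored map if the matching line never arrives
    timeout => 120
  }
}
```

With this, whichever line is read second for a given `job_id` ends up carrying both the description and the result, so you can filter on e.g. `result: Error*` in Kibana.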

Any advice?