Multi pipelines

Hi,

I have two pipelines that collect CSV files from different folders, and each folder's data goes to a different index. The file names are the same (the information inside the files is a bit different). My problem is that Logstash only collects from one of the folders.

Please post the content of your pipelines.yml, the .conf files and Logstash logs. An abstract description of the problem is not enough to solve this.

Hi,

pipelines.yml:

- pipeline.id: logstash1
  path.config: "/kibana/logstash-7.6.1/logstash-7.6.1/bin/p1/logstash1.conf"
  pipeline.workers: 3
  pipeline.batch.size: 5
- pipeline.id: logstash2
  path.config: "/kibana/logstash-7.6.1/logstash-7.6.1/bin/p2/logstash2.conf"
  pipeline.workers: 3
  queue.type: persisted
  pipeline.batch.size: 5
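
One thing to rule out first (this is an assumption about how Logstash is started; it is not stated in the thread): pipelines.yml is only read when Logstash is started without the -e or -f flags. A command such as

  bin/logstash -f p1/logstash1.conf

(logstash.bat on Windows) loads only that single pipeline and ignores pipelines.yml, which would explain why only one folder is collected.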

conf

input {
  file {
    path => "c:/input/*.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => ["time","file type","Product name","Side","board result","Machine NAME","Barcode","Barcode slave","Component Qty","Real Component Qty","NG Amount","NG Qty","Operator","Working Order","module number","location","Matiral NO","NG name","NG result"]
  }
  mutate { convert => ["Component Qty", "integer"] }
  mutate { convert => ["Real Component Qty", "integer"] }
  mutate { convert => ["NG Amount", "integer"] }
  mutate { convert => ["NG Qty", "integer"] }
  date {
    match => [ "time", "yyyyMMdd HHmmss" ]
    timezone => "UTC"
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    user => "elastic"
    password => "sLf7r0eAsdKD4XPxoWpO"
    index => "aoi_pemtron"
  }
  stdout {}
}

I don't get any errors in the cmd window; it just doesn't import the files.
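
Another common cause worth checking (a guess, not something reported in the thread): the file input records how far it has read each file in a sincedb file, and start_position => "beginning" only applies to files Logstash has never seen before. If a file was already picked up once during testing, it will not be re-imported. Giving each pipeline its own explicit sincedb_path makes that state easy to find and reset; a minimal sketch, with a hypothetical path:

input {
  file {
    path => "c:/input/*.csv"
    start_position => "beginning"
    # hypothetical location; point each pipeline at its own file
    sincedb_path => "c:/logstash/sincedb-logstash1"
  }
}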

Anyone? Please.

Hi @peretz_shlomi,

Do you have the same input config on both pipelines?

I'm especially curious whether you have the same path in both. You did say you "collect CSV files from different folders", so I guess not...?
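
For reference, here is a minimal sketch of how the two inputs and outputs would normally differ. The folder names, sincedb paths, and the second index name below are hypothetical, not taken from the thread:

# p1/logstash1.conf
input {
  file {
    path => "c:/input1/*.csv"                 # first folder (hypothetical)
    start_position => "beginning"
    sincedb_path => "c:/logstash/sincedb-p1"  # hypothetical
  }
}
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "aoi_pemtron"
  }
}

# p2/logstash2.conf
input {
  file {
    path => "c:/input2/*.csv"                 # second folder (hypothetical)
    start_position => "beginning"
    sincedb_path => "c:/logstash/sincedb-p2"  # hypothetical
  }
}
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "aoi_pemtron_2"                  # different index (hypothetical name)
  }
}

If both .conf files used the same path pattern and the same index, the result would look exactly like what you describe: data appearing to come from only one folder and one index.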
