Import CSV files into two different indices

hello

I want to import two different CSV file types into two different indices.

What is the right way to do it?

You need to make the output send events to two indices. So you have to tag the CSV files and then ensure each goes to its own index.

Something along the lines below: you could mutate your fields to set an index prefix, then use that prefix in the output.

# pipelines.yml
- pipeline.id: employees_index
  path.config: "pipelines/employees_index.conf"
# employees_index.conf
input {
    file {
        path => "/tmp/csv/employees_it.csv"
        tags => ["employees_it"]
    }
    file {
        path => "/tmp/csv/employees_field.csv"
        tags => ["employees_field"]
    }
}

filter {
  if "employees_it" in [tags] {
    mutate {
      add_field => { "[@metadata][index_prefix]" => "employees_it" }
    }
  } else if "employees_field" in [tags] {
    mutate {
      add_field => { "[@metadata][index_prefix]" => "employees_field" }
    }
  } else {
    mutate {
      add_field => { "[@metadata][index_prefix]" => "employees_unknown" }
    }
  }
  csv {
    separator => ","
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    user => "elastic"
    password => "somepassword"
    index => "%{[@metadata][index_prefix]}-%{+YYYY.MM}"
  }
}
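To illustrate how the conditional above routes events, here is the same logic sketched in Python (the function name and sample date are mine, purely for illustration): Logstash's `%{+YYYY.MM}` sprintf reference formats the event's `@timestamp` in UTC, so each event lands in a monthly index named after its prefix.

```python
from datetime import datetime, timezone

def index_for(tags, when):
    """Mirror the tag-to-prefix routing in the filter above (illustration only)."""
    if "employees_it" in tags:
        prefix = "employees_it"
    elif "employees_field" in tags:
        prefix = "employees_field"
    else:
        prefix = "employees_unknown"
    # Logstash's %{+YYYY.MM} renders the event @timestamp as year.month
    return f"{prefix}-{when:%Y.%m}"

print(index_for(["employees_it"], datetime(2024, 5, 1, tzinfo=timezone.utc)))
# employees_it-2024.05
```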

Hi,
thanks for the help.

Attached is my settings file; can you help me with the filter?
All the files in c:\input go to index aoi_false.
All the files in c:\inputppm go to index aoi_ppm.

The logstash.conf file:

input {
  file {
    path => "c:/input/*.csv"
    tags => ["aoi_false"]
    start_position => "beginning"
  }
  file {
    path => "c:/inputppm/*.csv"
    tags => ["aoi_ppm"]
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => ["time","file type","Product name","Side","board result","Machine NAME","Barcode","Barcode slave","Component Qty","Real Component Qty","NG Amount","NG Qty","Operator","Working Order","module number","location","Matiral NO","NG name","NG result"]
  }
  mutate { convert => ["Component Qty", "integer"] }
  mutate { convert => ["Real Component Qty", "integer"] }
  mutate { convert => ["NG Amount", "integer"] }
  mutate { convert => ["NG Qty", "integer"] }
  date {
    match => [ "time", "yyyyMMdd HHmmss" ]
    timezone => "UTC"
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    user => "elastic"
    password => "sLf7r0eAsdKD4XPxoWpO"
    index => "aoi_false"
  }
  stdout {}
}

Please try (PS: I haven't tested this as I don't have sample data. You may need to tweak here and there)

input {
  file {
    path => "c:/input/*.csv"
    tags => ["aoi_false"]
    start_position => "beginning"
  }
  file {
    path => "c:/inputppm/*.csv"
    tags => ["aoi_ppm"]
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => ["time","file type","Product name","Side","board result","Machine NAME","Barcode","Barcode slave","Component Qty","Real Component Qty","NG Amount","NG Qty","Operator","Working Order","module number","location","Matiral NO","NG name","NG result"]
  }
  mutate {convert => ["Component Qty" , "integer"] }
  mutate {convert => ["Real Component Qty" , "integer"] }
  mutate {convert => ["NG Amount" , "integer"] }
  mutate {convert => ["NG Qty" , "integer"] }
  date {
    match => [ "time", "yyyyMMdd HHmmss"]
    timezone => "UTC"
  }
  
  if "aoi_false" in [tags] {
    mutate {
      add_field => { "[@metadata][index_prefix]" => "aoi_false" }
    }
  } else if "aoi_ppm" in [tags] {
    mutate {
      add_field => { "[@metadata][index_prefix]" => "aoi_ppm" }
    }
  } else {
    mutate {
      add_field => { "[@metadata][index_prefix]" => "aoi_unknown" }
    }
  }  
  
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    user => "elastic"
    password => "sLf7r0eAsdKD4XPxoWpO"
    index => "%{[@metadata][index_prefix]}-%{+YYYY.MM}"
  }
}
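Since I don't have sample data, one way to smoke-test the pipeline is to fabricate a row matching the columns declared in the csv filter above. A minimal sketch in Python (the file name sample.csv and all values are made-up placeholders; no header row is written, since the filter supplies the column names itself, and "time" follows the date filter's yyyyMMdd HHmmss pattern):

```python
import csv
from datetime import datetime

# Column list copied from the csv filter's "columns" setting above
COLUMNS = ["time", "file type", "Product name", "Side", "board result",
           "Machine NAME", "Barcode", "Barcode slave", "Component Qty",
           "Real Component Qty", "NG Amount", "NG Qty", "Operator",
           "Working Order", "module number", "location", "Matiral NO",
           "NG name", "NG result"]

row = {c: "x" for c in COLUMNS}                       # placeholder values
row["time"] = datetime(2024, 5, 1, 12, 0, 0).strftime("%Y%m%d %H%M%S")
for c in ("Component Qty", "Real Component Qty", "NG Amount", "NG Qty"):
    row[c] = "1"                                      # numeric fields the mutate converts

with open("sample.csv", "w", newline="") as f:
    csv.writer(f).writerow([row[c] for c in COLUMNS])
```

Drop the generated file into one of the watched directories and watch the stdout output to see how the event is parsed.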

pipelines.yml - What exactly do I need to change?

The two config files are inside the config directory, and the pipelines.yml is there also.

In the pipelines.yml I updated:

- pipelines.id: logstash1
  path.config: "logstash1"
- pipelines.id: logstash2
  path.config: "logstash2"

I'm now stuck running Logstash.
I enter the command:

logstash --path.settings c:/kibana/logstash/config/

I get the error "failed to read pipelines yaml file".
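Note that the pipelines.yml quoted above uses the key pipelines.id, while the documented key (also shown in the example earlier in this thread) is the singular pipeline.id, and path.config should point at each .conf file. A sketch, assuming the settings directory from the command above (the .conf file names are placeholders for your two config files):

```yaml
# pipelines.yml - note the singular key "pipeline.id"
- pipeline.id: logstash1
  path.config: "c:/kibana/logstash/config/logstash1.conf"
- pipeline.id: logstash2
  path.config: "c:/kibana/logstash/config/logstash2.conf"
```

The file must parse as a YAML list; invalid YAML (for example, indentation mangled by copying from a rich-text editor) can also produce the "failed to read pipelines yaml file" error.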