Hi guys!
Does anyone have an idea why I can't import all the data from a CSV file into Elasticsearch via Logstash? It always imports the same number of documents, 6873, while there should be more than 53k. At runtime the console shows the data being processed, but it doesn't end up in Elasticsearch, and I don't get any errors.
Data
Example of one row from the CSV file (header line followed by one record):
SKU;Kategoria;Rodzina;Marka;Kod u dostawcy;Kod u dostawcy 2;Nazwa towaru
00000290;1_brands,1_restaurant_equipment,2_brand_rm_gastro,2_food_holding_and_warming_equipment,3_steam_heaters_and_buffets,4_bain_marie_heaters,categoryE7E7163;bemary grzewcze;rm gastro;00000290;BMPD 2120;Bemar elektryczny 2-komorowy, GN 1/1, jezdny z kranem spustowym, 1,4 kW;
Code
input {
  file {
    path => "C:/db.csv"
    start_position => "beginning"
    sincedb_path => "NULL"
  }
}
filter {
  csv {
    separator => ";"
    skip_empty_columns => true
    columns => ["SKU","Kategoria","Rodzina","Marka","Kod u dostawcy","Kod u dostawcy2","Nazwa towaru"]
  }
  ruby {
    code => '
      c = event.get("Kategoria")
      if c
        a = c.split(",")
        event.set("Kategoria", a.shift)
        a.each { |x| event.set(x.strip, x.strip) }
      end
    '
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "products"
    user => "elastic"
    password => "**"
  }
  stdout {}
}
When I comment out this part of the config, I'm able to import all the rows from the CSV:
ruby {
  code => '
    c = event.get("Kategoria")
    if c
      a = c.split(",")
      event.set("Kategoria", a.shift)
      a.each { |x| event.set(x.strip, x.strip) }
    end
  '
}
However, this part of the config works correctly and causes no problems when I import only 1000 lines into Elasticsearch; in that case all the data is stored fine.
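
For reference, here is a minimal sketch in plain Ruby (run outside Logstash, with a hash standing in for the event) of what that ruby filter does to the Kategoria value from the example row: the first segment stays in Kategoria, and every remaining segment becomes its own top-level field on the event.

# Plain-Ruby sketch of the filter's effect on the sample row's "Kategoria" value.
# The hash stands in for the Logstash event; the value is copied from the example row.
event = {
  "Kategoria" => "1_brands,1_restaurant_equipment,2_brand_rm_gastro," \
                 "2_food_holding_and_warming_equipment,3_steam_heaters_and_buffets," \
                 "4_bain_marie_heaters,categoryE7E7163"
}

a = event["Kategoria"].split(",")
event["Kategoria"] = a.shift                 # first segment stays in Kategoria
a.each { |x| event[x.strip] = x.strip }      # each remaining segment becomes its own field

p event.keys
# => ["Kategoria", "1_restaurant_equipment", "2_brand_rm_gastro",
#     "2_food_holding_and_warming_equipment", "3_steam_heaters_and_buffets",
#     "4_bain_marie_heaters", "categoryE7E7163"]

So for every row, each category segment ends up as a separate field in the indexed document.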