I want to load data into Elasticsearch using Logstash. Here is a sample of my data:
product-name,price,number_of_customers,general_feddback,promotion_period,percentage_promotion,number_of_customers_after_promotion,categorie_produit
Taillefine aux fruits 0% Fraise - Danone - 500 g (4 x 125 g),556.26,481018,3,10-10-2016,22,516292.653333333,yaourt
Activia Fibre - Danone - 171,5 g (150 g+21,5 g),612.98,961714,2,4-8-2016,32,1064296.82666667,yaourt
Taillefine Yaourt Nature - Danone - 1,5 kg e (12 * 125 g),254.07,845922,4,11-4-2016,39,955891.86,yaourt
Activia Noix de Coco - Danone - 650 g,256.19,497515,1,1-14-2016,25,538974.583333333,yaourt
Danio Raspberry - danone - 160g,312.67,581307,5,3-17-2017,17,614247.73,yaourt
Fjørd nature - Danone - 500 g, 4 pots de 125 g,425.59,878459,4,8-26-2016,26,954592.113333333,yaourt
This is the config file:
input {
  file {
    path => "/opt/logstash-5.5.0/data/es_data_base.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    # product-name,price,number_of_customers,general_feddback,promotion_period,percentage_promotion,number_of_customers_after_promotion,categorie_produit
    columns => ["product-name","price","number_of_customers","general_feddback","promotion_period","percentage_promotion","number_of_customers_after_promotion","categorie_produit"]
  }
}
output {
  elasticsearch {
    hosts => "http://197.12.8.3:9200"
    index => "es_retails"
  }
  stdout {}
}
This warning appeared:
Received an event that has a different character encoding than you configured. {:text=>"Ketchup - Jardin Bio' - 560\xA0g,2784.03,18174,3,1-7-2016,26,3025.3126,salami_viande", :expected_charset=>"UTF-8"}
How can I fix this? Any help would be appreciated.
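My guess is that the CSV file was not saved as UTF-8: the \xA0 byte in the warning looks like a non-breaking space in Latin-1 / Windows-1252. Would setting the charset on the input codec be the right approach? A minimal sketch of what I have in mind, assuming the file really is ISO-8859-1 encoded (that encoding is my assumption, not something I have confirmed):

input {
  file {
    path => "/opt/logstash-5.5.0/data/es_data_base.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    # assumption: the CSV was exported as Latin-1; adjust the charset if it was exported differently
    codec => plain {
      charset => "ISO-8859-1"
    }
  }
}

If I understand the charset option correctly, Logstash would then decode each line from that encoding and hand UTF-8 text to the csv filter, which should make the warning go away, provided my guess about the source encoding is right.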