Hi everyone,
I use the Logstash configuration below to send data to Elasticsearch:
input {
  file {
    path => "C:/Users/SOUMAYA/Desktop/bureau.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
    codec => plain { charset => "CP1252" }
  }
}
filter {
  csv {
    separator => ";"
    convert => {
      "longitude" => "float"
      "latitude" => "float"
    }
  }
  date { match => [ "time", "dd MMM yy HH:mm:ss" ] }
  mutate { add_field => { "location" => "%{latitude},%{longitude}" } }
}
output {
  elasticsearch {
    action => "index"
    hosts => ["http://localhost:9200/"]
    index => "bureau"
  }
  stdout { codec => rubydebug }
}
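Before running Logstash, a quick sanity check of the file itself can help rule out separator or encoding issues. Below is a minimal sketch in Python; the sample row and header names are only hypothetical stand-ins for bureau.csv, not its real contents.

```python
# Sketch: verify that a ";"-separated CSV parses as expected
# before handing it to the Logstash csv filter.
# The sample string below is made-up placeholder data.
import csv
import io

sample = (
    "name;latitude;longitude;time\r\n"
    "Bureau A;36.8;10.18;01 Jan 24 12:00:00\r\n"
)

rows = list(csv.reader(io.StringIO(sample), delimiter=";"))
headers, first = rows[0], rows[1]

# The first row should show the real field labels.
print(headers)
# Pairing headers with the first data row previews one parsed event.
print(dict(zip(headers, first)))
```

For the real file, the string would be replaced by `open(path, encoding="cp1252")`, matching the CP1252 charset declared in the input codec.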
The problem is that the data is indexed in a messy way, and the field labels are saved as column1, column2, and so on instead of the real field names from the file.
Here is a capture from Elasticsearch.
Also, here is a capture of the data file (in xlsx format, just for a clearer view).
Could you please help me?