Hi,
My issue:
- When I ship CSV files with different headers from Filebeat, the csv filter does not match the right column with the right value.
Does anyone know how to handle that, i.e. make Logstash forget the previous CSV header it read from the Filebeat stream?
Example:
file1.csv :
c1, c2, c3
1, 2, 3
output: c1=1, c2=2, c3=3
file2.csv:
c1, c2, c3, c4
1,2,3,4
output: c2=1, c3=2, c4=3
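To illustrate what I think is happening, here is a minimal Python sketch (not Logstash code) of a parser that autodetects column names from the first line it sees and then reuses them for every later line in the stream. The class and method names are my own invention; the exact shift in the real output depends on which header line the filter latches onto, but the root cause is the same: the cached header from file1 is applied to file2.

```python
import csv
import io

class CachedHeaderParser:
    """Simulates a CSV filter that autodetects column names once
    and keeps them for every later line in the stream."""

    def __init__(self):
        self.columns = None

    def parse(self, line):
        fields = next(csv.reader(io.StringIO(line)))
        if self.columns is None:
            # First line ever seen becomes the header -- for the whole stream.
            self.columns = [f.strip() for f in fields]
            return None
        # Every later line is mapped against that cached header.
        return dict(zip(self.columns, (f.strip() for f in fields)))

parser = CachedHeaderParser()
parser.parse("c1, c2, c3")            # file1's header is cached
print(parser.parse("1, 2, 3"))        # file1's rows map correctly

# file2's header arrives as an ordinary event: it is parsed as data,
# and file2's rows are still matched against file1's columns,
# so the extra "c4" value is silently dropped.
print(parser.parse("c1, c2, c3, c4"))
print(parser.parse("1,2,3,4"))
```

So the question is really: how do I reset that cached header whenever a new file (with a new header line) starts coming in from Filebeat?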
I set up my logstash config as below:
input {
  beats {
    port => "5044"
  }
}
filter {
  if [fields][log_type] == "bucking" {
    csv {
      separator => ","
      autodetect_column_names => true
      autogenerate_column_names => true
      skip_header => false
      skip_empty_columns => false
      skip_empty_rows => false
    }
    mutate {
      convert => {
        "Output_id" => "integer"
        "Diameter" => "float"
        "Price" => "float"
        "TotalValue" => "float"
        "Volume" => "float"
        "NominalVolume" => "float"
        "RealVolume" => "float"
        "SawdustVolume" => "float"
        "NbSol" => "integer"
        "NumShapePLC" => "integer"
        "TimeDisp" => "float"
        "TimeOpti" => "float"
        "TimeWait" => "float"
        "TimeSend" => "float"
        "TimeOptiMin" => "float"
        "TimeOptiMax" => "float"
        "TimeOptiAverage" => "float"
        "TimeOptiTotal" => "float"
        "EtatSolution" => "integer"
        "ValeurReelle" => "float"
        "Version_TVL" => "string"
      }
    }
    mutate {
      copy => {
        "[fields][log_type]" => "bucking"
      }
    }
    prune {
      whitelist_names => ["Output_id", "NumShapePLC", "Version_TVL"]
    }
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "%{[fields][log_type]}"
  }
  stdout {}
}