Hey guys,
I'm trying to count the number of rows in my CSV file (it contains 2631 rows) and save that count in a variable.
I first tried a ruby filter (roughly the sketch below) but it didn't work, so I switched to the metrics filter, and now I get a result that I don't even understand.
Is there any way to count the number of rows using this filter or another one?
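For context, my ruby filter attempt looked roughly like this (just a rough sketch; the "row_count" field name is my own):

ruby {
  # keep a running counter in an instance variable and
  # write the current total into every event
  init => "@row_count = 0"
  code => "
    @row_count += 1
    event.set('row_count', @row_count)
  "
}

As far as I understand, this only gives a running count on each event (and each pipeline worker keeps its own counter unless pipeline.workers is 1), so I never end up with the final total in one place.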
Here is my current config and the result I get:
input {
  file {
    path => "C:/Users/SOUMAYA/Desktop/couverture_2g.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
    codec => plain { charset => "CP1252" }
  }
}
filter {
  csv {
    separator => ";"
    columns => [
      "Message",
      "Time",
      "Distance",
      "Longitude",
      "Latitude",
      "ServRxLevIdle"
    ]
    convert => {
      "Longitude" => "float"
      "Latitude" => "float"
      "ServRxLevIdle" => "float"
    }
  }
  mutate { rename => { "ServRxLevIdle" => "Contrainte" } }
  mutate { add_field => { "Location" => ["%{[Latitude]}", "%{[Longitude]}"] } }
  mutate { convert => { "Location" => "float" } }
  date { match => [ "Time", "dd MMM yy HH:mm:ss" ] }
  metrics {
    meter => "%{@timestamp}_count"
    add_tag => "metrics"
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => ["http://localhost:9200/"]
    index => "monica"
  }
  stdout { codec => rubydebug }
}
The console output:
},
"2020-06-18T08:57:46.675Z_count" => {
"rate_15m" => 0.3659788914920122,
"rate_1m" => 0.10543885524629079,
"rate_5m" => 0.30637133534585953,
"count" => 2
},
"2020-06-18T08:57:53.676Z_count" => {
"rate_15m" => 4.940715035142166,
"rate_1m" => 1.4234245458249257,
"rate_5m" => 4.136013027169103,
"count" => 27
},
"2020-06-18T08:57:57.711Z_count" => {
"rate_15m" => 0.1850339994068403,
"rate_1m" => 0.06228064478291957,
"rate_5m" => 0.15837791326735637,
"count" => 1
},
"2020-06-18T08:57:56.533Z_count" => {
"rate_15m" => 0.5489683372380184,
"rate_1m" => 0.15815828286943623,
"rate_5m" => 0.4595570030187892,
"count" => 3
},
"2020-06-18T08:57:36.776Z_count" => {
"rate_15m" => 0.363951312806147,
"rate_1m" => 0.09700842985425956,
"rate_5m" => 0.30130746258186275,
"count" => 2
}
}
Could you please help me?