Sorry for missing the conversation.
My CSV file is this:
Time,SPA,SPB
Thu Mar 12 16:10:25 EDT 2015,41.30672072,43.15346226
Thu Mar 12 16:20:55 EDT 2015,40.14174087,43.43324251
Thu Mar 12 16:31:03 EDT 2015,41.03425118,41.57152451
Thu Mar 12 16:41:03 EDT 2015,41.60066007,40.12864918
Thu Mar 12 16:51:03 EDT 2015,40.94268353,39.95073892
Thu Mar 12 17:01:03 EDT 2015,42.63400197,36.82393556
Thu Mar 12 17:11:18 EDT 2015,43.05921053,39.68384653
Thu Mar 12 17:21:34 EDT 2015,42.19471642,38.6971766
Thu Mar 12 17:31:34 EDT 2015,38.66909452,42.31278993
Thu Mar 12 17:41:34 EDT 2015,38.25136612,39.92710404
Thu Mar 12 17:51:34 EDT 2015,38.63862206,37.51243781
Thu Mar 12 18:01:34 EDT 2015,40.45046695,37.14704804
and my Logstash configuration is:
input {
file {
path => ["/home/user/sputils.csv"]
start_position => "beginning"
sincedb_path => "/dev/null"
}
}
filter {
csv {
columns => ["Time","SPA","SPB"]
separator => ","
}
date {
match => ["Time", "EEE MMM dd HH:mm:ss zzz yyyy"]
}
mutate { convert => ["SPA", "float"] }
mutate { convert => ["SPB", "float"] }
}
output {
elasticsearch {
# action => "index"
# host => "localhost"
# index => "sutil"
# workers => 1
}
stdout { codec => rubydebug}
}
I get:
"message" => [
[0] "Mon Mar 16 01:12:00 EDT 2015,56.9680939,57.83989414"
],
"@version" => "1",
"@timestamp" => "2015-07-24T14:10:58.529Z",
"host" => "0.0.0.0",
"path" => "/home/user/sputils.csv",
"Time" => "Mon Mar 16 01:12:00 EDT 2015",
"SPA" => 56.9680939,
"SPB" => 57.83989414
However "Time" is not the time type. it is getting generated as String.
I went through this,
But couldn't do much.
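For context, the Logstash date filter writes the parsed value to @timestamp by default, which is why the original field stays a string. A minimal sketch of a filter that should overwrite "Time" itself with the parsed date (note the quoted field name and lowercase yyyy; whether the zone name "EDT" parses via zzz can depend on the JVM):

```
filter {
  date {
    # Quoted field name; lowercase "yyyy" (uppercase "YYYY" is Joda week-year).
    match  => ["Time", "EEE MMM dd HH:mm:ss zzz yyyy"]
    # Without "target" the parsed date goes to @timestamp and "Time"
    # remains a string; targeting "Time" replaces it with the date value.
    target => "Time"
  }
}
```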
Regards