Convert string to date in logstash config file

Hi, my Logstash config file (reading my CSV from stdin) is:
input {
  stdin {}
}
filter {
  csv {
    separator => " "
    columns => ["frame.time","col.Source","col.Destination","col.Protocol","frame.len"]
  }
  mutate { rename => { "frame.time" => "Time" } }
  mutate { rename => { "col.Source" => "Source" } }
  mutate { rename => { "col.Destination" => "Destination" } }
  mutate { rename => { "col.Protocol" => "Protocol" } }
  mutate { rename => { "frame.len" => "Length" } }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "d"
    workers => 1
  }
  stdout {
    codec => rubydebug
  }
}
When I run this file, the "Time" field shows up in Kibana as a string, but I want its data type to be date. What should I do?

Output when I run the config file:
"message" => "May 2, 2013 07:32:15.227019000\t115.165.168.105\t202.141.75.170\tTCP\t66",
"@version" => "1",
"@timestamp" => "2016-06-16T11:14:51.341Z",
"host" => "sab-VirtualBox",
"Time" => "May 2, 2013 07:32:15.227019000",
"Source" => "115.165.168.105",
"Destination" => "202.141.75.170",
"Protocol" => "TCP",
"Length" => "66",
"tags" => [
[0] "_dateparsefailure"
]

I want @timestamp to take the value of my Time field, or the Time field should be of date type.

I suggest you use a date filter to parse the contents of the Time field and turn it into something that ES can recognize as a timestamp.
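A minimal sketch of such a date filter, assuming the Time field always looks like the sample value "May 2, 2013 07:32:15.227019000" (the exact pattern strings below are an assumption; adjust them to your data, and note that timestamps are only stored with millisecond precision, so the extra fractional digits are truncated):

```
filter {
  date {
    # Parse the renamed "Time" field; two patterns cover
    # single-digit ("May 2") and double-digit ("May 12") days.
    match => ["Time", "MMM d, yyyy HH:mm:ss.SSSSSSSSS",
                      "MMM dd, yyyy HH:mm:ss.SSSSSSSSS"]
    # target defaults to @timestamp, so the event timestamp
    # is overwritten with the parsed value.
    target => "@timestamp"
  }
}
```

Place this after the csv and mutate filters so the Time field already exists when the date filter runs. If parsing still fails, Logstash adds a `_dateparsefailure` tag (as in your output above), which usually means the pattern does not match the field's actual format.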

Thanks for the help.
It works.