Dateparsefailure - Error in Date value

Hello everyone,
I've been trying for a few days to load data from a CSV file, but I haven't been able to get past this error in the date field.

This message:
Failed parsing date from field {:field=>"Datahora", :value=>"2016-05-09 13:57:49,B82528,265,\"-22.858118\",\"-43.371143\",0.19", :exception=>"Invalid format: \"2016-05-09 13:57:49,B82528,265,\"-22.858118\",\"-43.37...\" is malformed at \",B82528,265,\"-22.858118\",\"-43.37...\"", :config_parsers=>"YYYY-MM-dd HH:mm:ss.SSS", :config_locale=>"default=pt_PT", :level=>:warn}
Failed parsing date from field {:field=>"Datahora", :value=>"2016-05-09 13:59:36,D86095,867,\"-22.931969\",\"-43.574604\",3.52", :exception=>"Invalid format: \"2016-05-09 13:59:36,D86095,867,\"-22.931969\",\"-43.57...\" is malformed at \",D86095,867,\"-22.931969\",\"-43.57...\"", :config_parsers=>"YYYY-MM-dd HH:mm:ss.SSS", :config_locale=>"default=pt_PT", :level=>:warn}
{
       "message" => "\"2016-05-09 11:28:32,B19524,665,\"\"-22.809641\"\",\"\"-43.343151\"\",0\"\r",
      "@version" => "1",
    "@timestamp" => "2016-06-09T11:54:37.819Z",
          "path" => "c:/tmp/onibus.csv",
          "host" => "pc-marcio",
      "Datahora" => "2016-05-09 11:28:32,B19524,665,\"-22.809641\",\"-43.343151\",0",
          "tags" => [
        [0] "_dateparsefailure"
    ]
}

My CSV file:
Datahora,Ordem,Linha,Latitude,Longitude,Velocidade
2016-05-09 00:05:42,D53517,,"-22.883249","-43.495152",0
2016-05-09 00:20:08,B10547,,"-22.86828","-43.25724",0
2016-05-09 00:28:39,C41386,,"-22.8741","-43.241573",0.56
2016-05-09 00:30:29,D53568,,"-22.883801","-43.494732",0
2016-05-09 00:36:32,B58075,621,"-22.837311","-43.28672",27

My logstash_myapp.conf:

input {
  file {
    path => "c:/tmp/onibus.csv"
    start_position => "beginning"
  }
}
filter {
  if [message] =~ /^[a-z]*,/ {
    drop { }
  }
  csv {
    separator => ","
    columns => ["Datahora","Ordem","Linha","Latitude","Longitude","Velocidade"]
  }
  date {
    match => ["Datahora", "YYYY-MM-dd HH:mm:ss.SSS"]
    remove_field => ["Datahora"]
  }
  mutate { convert => ["Velocidade", "float"] }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "myapp"
    workers => 1
  }
  stdout { codec => rubydebug }
}

What is wrong?

"message" => "\"2016-05-09 11:28:32,B19524,665,\"\"-22.809641\"\",\"\"-43.343151\"\",0\"\r",

Judging by this, your input line actually looks like this (or at least that's what Logstash thinks):

"2016-05-09 11:28:32,B19524,665,""-22.809641"",""-43.343151"",0"

The initial double quote makes Logstash look for a closing quote before it considers the first column "done". Please check your input data once more.
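You can reproduce this with Ruby's standard CSV library (a small sketch; the csv filter's quoting rules work the same way): the leading quote opens a field, the doubled quotes inside become literal quote characters, and the whole line collapses into a single column.

```ruby
require 'csv'

# The line as Logstash received it: an extra outer pair of quotes,
# with the inner quotes doubled.
wrapped = %q{"2016-05-09 11:28:32,B19524,665,""-22.809641"",""-43.343151"",0"}

# A correct, unwrapped CSV line.
plain = %q{2016-05-09 11:28:32,B19524,665,"-22.809641","-43.343151",0}

p CSV.parse_line(wrapped).length  # the whole line is parsed as one field
p CSV.parse_line(plain).length    # six separate fields
```

Note that the single field produced from the wrapped line is exactly the plain line, which is why it all ends up in "Datahora".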

Strange, my data seems to be correct. Do the quotes make a difference in the fields?

Strange, my data seems to be correct.

Simplify your configuration. Start with the very simplest one that doesn't do anything but read the file and dump it to a stdout { codec => rubydebug } output. Does that look okay? Start adding things back so your configuration converges towards your current broken configuration. Observe what happens.
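Such a stripped-down configuration might look like this (a sketch, reusing the path from the original config):

```
input {
  file {
    path => "c:/tmp/onibus.csv"
    start_position => "beginning"
  }
}
output {
  stdout { codec => rubydebug }
}
```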

Do the quotes make a difference in the fields?

Yes. Double quotes allow a field value to contain commas that would otherwise indicate the start of the next column.
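For example (an illustration using Ruby's standard CSV library with made-up field values; the csv filter behaves the same way):

```ruby
require 'csv'

# The comma inside the quoted field does not start a new column.
row = CSV.parse_line(%q{B19524,"Barra, Rio de Janeiro",0})
p row  # → ["B19524", "Barra, Rio de Janeiro", "0"]
```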