I'm trying to import data from a CSV into an Elasticsearch index using Logstash. The CSV has separate date and time columns, which I combine and then parse with the date filter plugin so the result can be used as @timestamp. There are 150,500 records, and every one of them parses and matches correctly except for a single record. Reviewing that record, there is nothing obviously abnormal about it that would cause such an issue, so I'm at a loss. I've deleted the index and rerun Logstash multiple times, and the same record fails every time. It gets tagged with _dateparsefailure, and its @timestamp is the only one containing the upload time instead of the parsed date.
I'm new to Logstash, so there's probably a better way to do this, but I have a "Date" field formatted like "MM/dd/yyyy 12:00:00 AM" (yes, every record is 12:00 AM) and a "Time" field formatted like "HH:mm". I pass them through the following filters:
truncate {
  # keep only the MM/dd/yyyy portion of Date
  fields => "Date"
  length_bytes => 10
}
mutate {
  # combine Date and Time into a single field
  add_field => { "DateTime" => "%{Date} %{Time}" }
  remove_field => [ "Date", "Time" ]
}
date {
  # parse the combined field into @timestamp
  match => [ "DateTime", "MM/dd/yyyy HH:mm" ]
  timezone => "America/Los_Angeles"
  remove_field => [ "DateTime" ]
}
The record in question has Date and Time values of 03/13/2016 12:00:00 AM and 02:30, so after the truncate and mutate filters its DateTime comes out as "03/13/2016 02:30".
I should also point out that I get no error when I add a stdout output; the event just shows up with the _dateparsefailure tag.
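In case it helps, a minimal pipeline along these lines reproduces the problem with just that one record (the generator input and rubydebug stdout output are only stand-ins for testing; my real config reads the CSV file and writes to the Elasticsearch index, and the filter block is the same as above):

input {
  generator {
    # feed just the one failing CSV line
    lines => [ "03/13/2016 12:00:00 AM,02:30" ]
    count => 1
  }
}
filter {
  csv {
    columns => [ "Date", "Time" ]
  }
  truncate {
    fields => "Date"
    length_bytes => 10
  }
  mutate {
    add_field => { "DateTime" => "%{Date} %{Time}" }
    remove_field => [ "Date", "Time" ]
  }
  date {
    match => [ "DateTime", "MM/dd/yyyy HH:mm" ]
    timezone => "America/Los_Angeles"
    remove_field => [ "DateTime" ]
  }
}
output {
  stdout { codec => rubydebug }
}

Running this single-event pipeline should show whether the failure really comes from the date filter on that combined value or from something else in the CSV.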