Creating a Logstash config file for a CSV with multiple date fields

I have a CSV file (UTF-8), which I have stored as a .txt file. It has multiple date fields, and I want those fields to show up in Kibana as actual dates. However, I always get an "Error parsing csv" message. Please help.
Note that the separator used is a tab.
Below is the config file:
input {
  file {
    path => "/etc/logstash/du/*.txt"
    type => "du"
    start_position => "beginning"
    sincedb_path => "/home/ec2-user/dusincedbfile"
  }
}
filter {
  csv {
    separator => "	"  # a literal tab character, per the note above
    columns => ["Employee ID","ReportDate","Employee Name","Designation","Grade","Baseline Start Dt","Baseline End Dt","Date Of Joining","Rate"]
  }
  date {
    match => ["ReportDate", "M/d/yyyy"]
    target => "ReportDate"
  }
  date {
    match => ["Baseline Start Dt", "M/d/yyyy"]
    target => "Baseline Start Dt"
  }
  date {
    match => ["Baseline End Dt", "M/d/yyyy"]
    target => "Baseline End Dt"
  }
  date {
    match => ["Date Of Joining", "M/d/yyyy"]
    target => "Date Of Joining"
  }
  mutate { convert => ["Rate", "float"] }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost:9200"
    index => "dus"
    workers => 1
  }
  stdout {}
}

Below is a copy-paste of the error:

{:timestamp=>"2016-04-25T11:58:29.717000+0000", :message=>"Pipeline main started"}
{:timestamp=>"2016-04-25T11:58:29.889000+0000", :message=>"Error parsing csv", :field=>"message", :source=

Without knowing what the input file looks like, it's hard to help.

Yes, you are right! On further analysis of the input file, I found that the tab separators had mistakenly been replaced with spaces. After fixing the tab issue, however, I am now getting a different error in Elasticsearch. It is as below:

MapperParsingException[failed to parse [ReportDate]]; nested: IllegalArgumentException[Invalid format: "ReportDate"];
        at org.elasticsearch.index.mapper.FieldMapper.parse(
        at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrField(
        at org.elasticsearch.index.mapper.DocumentParser.parseValue(
        at org.elasticsearch.index.mapper.DocumentParser.parseObject(
        at org.elasticsearch.index.mapper.DocumentParser.parseDocument(
        at org.elasticsearch.index.mapper.DocumentMapper.parse(
        at org.elasticsearch.index.shard.IndexShard.prepareCreate(
        at org.elasticsearch.index.shard.IndexShard.prepareCreateOnPrimary(
        at org.elasticsearch.action.index.TransportIndexAction.prepareIndexOperationOnPrimary(
        at org.elasticsearch.action.index.TransportIndexAction.executeIndexRequestOnPrimary(
        at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(
        at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(
        at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(
        at org.elasticsearch.transport.RequestHandlerRegistry.processMessageReceived(
        at org.elasticsearch.transport.TransportService$4.doRun(
        at java.util.concurrent.ThreadPoolExecutor.runWorker(
        at java.util.concurrent.ThreadPoolExecutor$
Caused by: java.lang.IllegalArgumentException: Invalid format: "ReportDate"
        at org.joda.time.format.DateTimeParserBucket.doParseMillis(
        at org.joda.time.format.DateTimeFormatter.parseMillis(
        at org.elasticsearch.index.mapper.core.DateFieldMapper$DateFieldType.parseStringValue(
        at org.elasticsearch.index.mapper.core.DateFieldMapper.innerParseCreateField(
        at org.elasticsearch.index.mapper.core.NumberFieldMapper.parseCreateField(
        at org.elasticsearch.index.mapper.FieldMapper.parse(
        ... 22 more

Also, here is a sample input file:
Employee ID ReportDate Baseline Start Dt Baseline End Dt Rate
680732 4/12/2016 12/1/2013 12/31/2015 10
679187 4/12/2016 12/1/2013 12/31/2015 0

However, despite the error, the data is getting loaded.
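
One detail worth noting: the exception says Invalid format: "ReportDate", i.e. Elasticsearch received the literal column name as the field value, which is what you would expect if the header row of the file were being parsed as an ordinary event. Below is a minimal sketch of a conditional that drops the header line before it reaches the csv and date filters; the regex on [message] is an assumption based on the sample file above.

filter {
  # Sketch: drop the header row so the literal column names
  # ("ReportDate" etc.) are never sent to the date-mapped fields.
  # Assumes the header line starts with "Employee ID", as in the sample.
  if [message] =~ /^Employee ID/ {
    drop { }
  }
}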

And how is the ReportDate field mapped in ES?

date {
  match => ["ReportDate", "M/d/yyyy"]
  target => "ReportDate"
}

Above is how I use it in Logstash. In Kibana, I then selected the index 'dus' and chose 'ReportDate' as the timestamp field.

No, what's the field's mapping in the Elasticsearch index?

$ curl -XGET 'http://localhost:9200/_mapping/dus'

Here is what I got when I ran the above command:
{"error":{"root_cause":[{"type":"type_missing_exception","reason":"type[[dus]] missing","index":"_all"}],"type":"type_missing_exception","reason":"type[[dus]] missing","index":"_all"},"status":404}

So do you actually have a "dus" type in any of your indexes?

Yes, it is getting created. All the values from the CSV (saved as a .txt file) are also getting loaded. Yet the error persists!