Creating a Logstash config file for a CSV with multiple date fields

I have a CSV file (UTF-8) which I have stored as a .txt file. It has multiple date fields, and I want those fields to show up in Kibana as actual 'date' types. However, I always get an "Error parsing csv" message. Please help.
Note that the separator used is a tab.
Below is the config file:
input {
  file {
    path => "/etc/logstash/du/*.txt"
    type => "du"
    start_position => "beginning"
    sincedb_path => "/home/ec2-user/dusincedbfile"
  }
}
filter {
  csv {
    separator => " "
    columns => ["Employee ID","ReportDate","Employee Name","Designation","Grade","Baseline Start Dt","Baseline End Dt","Date Of Joining","Rate"]
  }
  date {
    match => ["ReportDate", "M/d/yyyy"]
    target => "ReportDate"
  }
  date {
    match => ["Baseline Start Dt", "M/d/yyyy"]
    target => "Baseline Start Dt"
  }
  date {
    match => ["Baseline End Dt", "M/d/yyyy"]
    target => "Baseline End Dt"
  }
  date {
    match => ["Date Of Joining", "M/d/yyyy"]
    target => "Date Of Joining"
  }
  mutate {
    convert => ["Rate", "float"]
  }
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost:9200"
    index => "dus"
    workers => 1
  }
  stdout {}
}
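Note that the filter as pasted shows separator => " " with an ordinary space, while the question states the file is tab-separated. A minimal sketch of the csv filter with a literal tab as the separator (same column list assumed; depending on the Logstash version, "\t" may not be interpreted as an escape, so an actual tab character is pasted between the quotes):

filter {
  csv {
    # a literal tab character between the quotes
    separator => "	"
    columns => ["Employee ID","ReportDate","Employee Name","Designation","Grade","Baseline Start Dt","Baseline End Dt","Date Of Joining","Rate"]
  }
}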

Below is a copy-paste of the error:

{:timestamp=>"2016-04-25T11:58:29.717000+0000", :message=>"Pipeline main started"}
{:timestamp=>"2016-04-25T11:58:29.889000+0000", :message=>"Error parsing csv", :field=>"message", :source=

Without knowing what the input file looks like, it's hard to help.

Yes, you are right! On further analysis of the input file, I found that the tab separator had mistakenly been replaced with spaces. However, after fixing the tab issue, I am now getting a different error in Elasticsearch. It is as below:

MapperParsingException[failed to parse [ReportDate]]; nested: IllegalArgumentException[Invalid format: "ReportDate"];
        at org.elasticsearch.index.mapper.FieldMapper.parse(FieldMapper.java:329)
        at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrField(DocumentParser.java:309)
        at org.elasticsearch.index.mapper.DocumentParser.parseValue(DocumentParser.java:436)
        at org.elasticsearch.index.mapper.DocumentParser.parseObject(DocumentParser.java:262)
        at org.elasticsearch.index.mapper.DocumentParser.parseDocument(DocumentParser.java:122)
        at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:309)
        at org.elasticsearch.index.shard.IndexShard.prepareCreate(IndexShard.java:529)
        at org.elasticsearch.index.shard.IndexShard.prepareCreateOnPrimary(IndexShard.java:506)
        at org.elasticsearch.action.index.TransportIndexAction.prepareIndexOperationOnPrimary(TransportIndexAction.java:215)
        at org.elasticsearch.action.index.TransportIndexAction.executeIndexRequestOnPrimary(TransportIndexAction.java:224)
        at org.elasticsearch.action.bulk.TransportShardBulkAction.shardIndexOperation(TransportShardBulkAction.java:326)
        at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:119)
        at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:68)
        at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryPhase.doRun(TransportReplicationAction.java:639)
        at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37)
        at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryOperationTransportHandler.messageReceived(TransportReplicationAction.java:279)
        at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryOperationTransportHandler.messageReceived(TransportReplicationAction.java:271)
        at org.elasticsearch.transport.RequestHandlerRegistry.processMessageReceived(RequestHandlerRegistry.java:75)
        at org.elasticsearch.transport.TransportService$4.doRun(TransportService.java:376)
        at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalArgumentException: Invalid format: "ReportDate"
        at org.joda.time.format.DateTimeParserBucket.doParseMillis(DateTimeParserBucket.java:187)
        at org.joda.time.format.DateTimeFormatter.parseMillis(DateTimeFormatter.java:780)
        at org.elasticsearch.index.mapper.core.DateFieldMapper$DateFieldType.parseStringValue(DateFieldMapper.java:362)
        at org.elasticsearch.index.mapper.core.DateFieldMapper.innerParseCreateField(DateFieldMapper.java:528)
        at org.elasticsearch.index.mapper.core.NumberFieldMapper.parseCreateField(NumberFieldMapper.java:241)
        at org.elasticsearch.index.mapper.FieldMapper.parse(FieldMapper.java:321)
        ... 22 more

Also here is a sample input file:
Employee ID ReportDate Baseline Start Dt Baseline End Dt Rate
680732 4/12/2016 12/1/2013 12/31/2015 10
679187 4/12/2016 12/1/2013 12/31/2015 0

However, despite the error, the data is getting loaded.
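The IllegalArgumentException[Invalid format: "ReportDate"] in the stack trace suggests the header row of the .txt file is also being parsed as a data line, so the literal string "ReportDate" is sent to a field that Elasticsearch has mapped as a date; that would also explain why the real data rows still load despite the error. A minimal sketch of dropping that header line before the csv filter, assuming the header always begins with the text "Employee ID":

filter {
  # assumption: the header line starts with the literal column name "Employee ID"
  if [message] =~ /^Employee ID/ {
    drop { }
  }
  # ... csv, date and mutate filters as above ...
}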

And how is the ReportDate field mapped in ES?

date {
  match => ["ReportDate", "M/d/yyyy"]
  target => "ReportDate"
}

The above is how I use it in Logstash. In Kibana, I then selected the index 'dus' and chose 'ReportDate' as the timestamp field.

No, what's the field's mapping in the Elasticsearch index?

https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-get-mapping.html

$ curl -XGET 'http://localhost:9200/_mapping/dus'

Here is what I got when I ran the above command:
{"error":{"root_cause":[{"type":"type_missing_exception","reason":"type[[dus]] missing","index":"_all"}],"type":"type_missing_exception","reason":"type[[dus]] missing","index":"_all"},"status":404}

So do you actually have a "dus" type in any of your indexes?

Yes, it is getting created. All the values from the CSV (saved as a .txt file) are also getting loaded. Yet the error persists!
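For reference, the 404 above is likely because _mapping/dus asks Elasticsearch for a mapping type named "dus" across all indices (hence the type_missing_exception), whereas "dus" is the index name; the documents are presumably indexed under the type "du", coming from type => "du" in the file input. Requesting the mapping of the index itself should show how ReportDate was mapped, for example:

$ curl -XGET 'http://localhost:9200/dus/_mapping?pretty'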