Input Elasticsearch, output Redis using Logstash

I want to ship data from a source Elasticsearch cluster to a target Elasticsearch cluster via Redis, using Logstash.

The problem is that one field (named eventTime), which is of type date in the source Elasticsearch index, is being created as string rather than date in the target Elasticsearch.

I am using the config below:

input {
  # Read all documents from Elasticsearch matching the given query
  elasticsearch {
    hosts => "server.ip"
    index => "source_index"
    type  => "generic-component"
    query => '{ "query": { "filtered": { "query": { "match_all": {} }, "filter": { "range": { "eventTime": { "gte": "now-60s" } } } } }}'
  }
}

filter {
  mutate {
    add_field => { "log_environment" => "a" }
    add_field => { "host" => "some_ip" }
  }

  date {
    # HH (24-hour clock) rather than hh (12-hour) in the explicit pattern
    match  => ["eventTime", "yyyy-MM-dd'T'HH:mm:ss.SSSZ", "ISO8601", "UNIX", "UNIX_MS", "TAI64N"]
    #timezone => "Europe/Amsterdam"
    target => "eventTime"
  }
}

output {
  redis {
    host      => "Target_Redis_Buffer.ip"
    data_type => "list"
    key       => "logstash"
    codec     => json
  }
}
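
For reference, the target side is a second Logstash instance that drains the Redis list and writes into the target cluster; its config is roughly the following (the target host and index names here are placeholders, not my exact values):

input {
  redis {
    host      => "Target_Redis_Buffer.ip"
    data_type => "list"
    key       => "logstash"
    codec     => json
  }
}

output {
  elasticsearch {
    hosts => "target.server.ip"
    index => "target_index"
  }
}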

In the source, eventTime has sample values such as:
2016-09-01T16:20:00.073+02:00
2016-09-06T13:34:05.327+0200

I suspect I am goofing up with the date filter, but I am not able to figure out where.
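
To rule out the date filter itself, I can swap the Redis output for a plain stdout output and inspect how eventTime leaves the pipeline (just a debugging sketch):

output {
  stdout { codec => rubydebug }
}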

The field was probably mapped as string at some point (when eventTime looked different and couldn't be parsed as a timestamp), and now the mapping can't be changed without reindexing. What if you delete the destination index and try again (forcing a recreation), or leave the existing index alone and simply index into a new index?
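
As a sketch (assuming the destination index is called target_index and the target cluster is reachable at target.server.ip:9200; substitute your own names), you can verify and fix the mapping like this:

# Inspect how eventTime is currently mapped in the destination index
curl -XGET 'http://target.server.ip:9200/target_index/_mapping?pretty'

# Option 1: delete the destination index so it is recreated on the next write
curl -XDELETE 'http://target.server.ip:9200/target_index'

# Option 2: pre-create a new index with eventTime explicitly mapped as date,
# then point the elasticsearch output at that index instead
curl -XPUT 'http://target.server.ip:9200/target_index_v2' -d '{
  "mappings": {
    "generic-component": {
      "properties": {
        "eventTime": { "type": "date" }
      }
    }
  }
}'

With option 1, dynamic mapping will only pick date if the first eventTime value it sees parses as a timestamp, so option 2 (an explicit mapping) is the safer route.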