Date format match

Thanks for answering. I still get this error; this is my conf:

input {
  beats {
    # The port to listen on for filebeat connections.
    port => 5044
    # The IP address to listen for filebeat connections.
    host => "0.0.0.0"
  }
}

filter {
  grok {
    match => { "message" => "[%{TIMESTAMP_ISO8601:date_log}] | [%{WORD:ENV}] | [%{WORD:APPLICATION}] | [%{WORD:TYPE}] | [%{GREEDYDATA:LOGMS}]" }
  }
}

output {
  elasticsearch {
    hosts => ["11.224.212:9200"]
    manage_template => false
    index => "docker-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

output {
  file {
    path => "/tmp/file.txt"
    codec => line { format => "custom format: %{message}" }
  }
}

This is the debug output:
custom format: [2018-06-08 18:19:00] | [DEV] | [WSRESTALG] | [INFO] | [No active profile set, falling back to default profiles: default]
custom format: [2018-06-08 18:19:00] | [DEV] | [WSRESTALG] | [INFO] | [No active profile set, falling back to default profiles: default]
custom format: [2018-06-08T18:19:00] | [DEV] | [WSRESTALG] | [INFO] | [No active profile set, falling back to default profiles: default]

The last log line works, but the previous two don't; the date field is the only problem:

[2018-06-11T10:09:36,671][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"docker-2018.06.11", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0x1a1a2461], :response=>{"index"=>{"_index"=>"docker-2018.06.11", "_type"=>"doc", "_id"=>"YcT37mMBWr8dcirVVUUY", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [date_log]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: "2018-06-08 18:19:00" is malformed at " 18:19:00""}}}}}
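The `mapper_parsing_exception` suggests that Elasticsearch's dynamically mapped `date` type for `date_log` only accepts the ISO8601 form with a `T` separator, so `2018-06-08 18:19:00` (space separator) is rejected at index time. One possible fix (an untested sketch against the conf above) is to normalize `date_log` with Logstash's `date` filter before the event reaches the output:

```
filter {
  date {
    # Accept both the space-separated and T-separated variants seen in the logs.
    match  => [ "date_log", "yyyy-MM-dd HH:mm:ss", "ISO8601" ]
    # Write the parsed timestamp back into date_log;
    # drop `target` to overwrite @timestamp instead.
    target => "date_log"
  }
}
```

With this, every event carries `date_log` in a single canonical format, so the existing index mapping parses it.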

In http://grokconstructor.appspot.com/do/match#result the filter works, so the grok pattern itself matches.
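Since the grok pattern matches, an alternative is to fix the mapping side instead: tell Elasticsearch to accept both date formats for `date_log`. This is a hedged sketch assuming an ES 6.x-style template (the `document_type` setting in the conf points to 6.x); the template name `docker` is made up, while the index pattern and field name come from the conf above:

```
PUT _template/docker
{
  "index_patterns": ["docker-*"],
  "mappings": {
    "doc": {
      "properties": {
        "date_log": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd'T'HH:mm:ss"
        }
      }
    }
  }
}
```

Note that the conf already sets `manage_template => false`, which is consistent with managing the template manually like this.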