Timestamp not parsing with grok

I have a flat file (jobs.log) that contains lines like this:
2016-05-25 17:52:11,467 [INFO ] [main] [FileName.java:50] - Application: abc Rule: 123 Status: SUCCESS

I want the log timestamp to show in Kibana, so I am trying to replace @timestamp with the log's date and time. I am using the config file below:
input {
  # Read from flat file
  file {
    path => "\mypath\jobs.log"
    start_position => "beginning"
    ignore_older => 0
  }
}
filter {
  # check for patterns via the grok plugin
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:formattedDate}* Application: %{DATA:applicationId} Rule: %{WORD:ruleName} Status: %{WORD:status}" }
  }
  #date {
  #  match => [ "formattedDate", "YYYY-MM-dd HH:mm:ss,SSS", "ISO8601" ]
  #  locale => "en"
  #  target => "@timestamp"
  #}
}
output {
  elasticsearch {}
  stdout { codec => rubydebug }
}
On my console, formattedDate is not printed. The output is:
{
          "message" => "2016-05-25 17:52:11,467 [INFO ] [main] [FileName.java:50] - Application: abc Rule: 123 Status: SUCCESS",
         "@version" => "1",
       "@timestamp" => "2016-05-27T18:28:04.559Z",
             "path" => "\mypath\jobs.log",
             "host" => "hostName",
    "applicationId" => "abc",
         "ruleName" => "123",
           "status" => "SUCCESS"
}

How can I get @timestamp replaced with the log file's timestamp?

It seems it's not working; it's giving me the same output:
{
          "message" => "2016-05-25 17:52:11,467 [INFO ] [main] [FileName.java:50] - Application: abc Rule: 123 Status: SUCCESS",
         "@version" => "1",
       "@timestamp" => "2016-05-27T20:46:07.723Z",
             "path" => "\jobs.log",
             "host" => "hostName",
    "applicationId" => "abc",
         "ruleName" => "123",
           "status" => "SUCCESS"
}

Below is a screenshot from Kibana.

You seem to think that * is a general wildcard character, but it actually means "zero or more occurrences of the previous token", so you're effectively making the timestamp optional. You can get the desired effect by replacing * with .*.

Thanks!! That resolved my problem. One more question: I am able to get the formatted date now, but the time is off by a couple of hours. I am assuming this is a timezone problem; can you please help me set it?
The original timestamps in the logs are in GMT, and my local system is set to UTC-7:00.

Output:
          "message" => "2016-05-25 17:52:11,467 [INFO ] [main]
         "@version" => "1",
       "@timestamp" => "2016-05-26T00:52:11.467Z",
             "path" => "C:\Elastic\logs\jobs.log",
             "host" => "PHXDCJMP72",
    "formattedDate" => "2016-05-25 17:52:11,467",

filter:
grok {
  match => { "message" => "%{TIMESTAMP_ISO8601:formattedDate}.* Application: %{DATA:applicationId} Rule: %{WORD:ruleName} Status: %{WORD:status}" }
}
date {
  match => [ "formattedDate", "YYYY-MM-dd HH:mm:ss,SSS", "ISO8601" ]
  target => "@timestamp"
}

If the log's timezone doesn't match the timezone of the machine running Logstash you can use the date filter's timezone option to override it.
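A minimal sketch of that, assuming the log timestamps really are GMT ("UTC" is used here as the equivalent timezone ID):

```
date {
  match    => [ "formattedDate", "YYYY-MM-dd HH:mm:ss,SSS", "ISO8601" ]
  # interpret formattedDate as GMT/UTC instead of the machine's local UTC-7:00
  timezone => "UTC"
  target   => "@timestamp"
}
```

With that in place, "2016-05-25 17:52:11,467" should produce @timestamp 2016-05-25T17:52:11.467Z rather than 2016-05-26T00:52:11.467Z.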

Working as expected now. Thank you for your help!!