I have this error,

{:timestamp=>"2016-09-14T16:43:48.937000+0100", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"2016-09-14 15:43:48.258000", :exception=>"Invalid format: "2016-09-14 15:43:48.258000"", :config_parsers=>"ISO8601,yyyy-MM-dd'T'HH:mm:ss.SSSZZ,yyyy-MM-dd HH:mm:ss,SSS,MMM dd YYYY HH:mm:ss", :config_locale=>"default=en_US", :level=>:warn}

My filter config:

grok {
  add_tag => [ "valid" ]
  match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{DATA} Processed (?:inbound|outbound) message for ([^\s]+): %{GREEDYDATA:json_data}" }
}

json {
  source => "json_data"
}

date {
  match => [ "timestamp", "ISO8601", "yyyy-MM-dd'T'HH:mm:ss.SSSZZ", "yyyy-MM-dd HH:mm:ss,SSS", "MMM dd YYYY HH:mm:ss" ]
  remove_field => ["timestamp"]
  target => "@timestamp"
}

Can someone help?

Since you have microsecond precision in your input I expect you need SSSSSS instead of just SSS.
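
For example, the fractional-seconds patterns in the match above would become (a sketch of that one change, leaving the rest of the filter as posted):

```
date {
  match => [ "timestamp", "ISO8601", "yyyy-MM-dd'T'HH:mm:ss.SSSSSSZZ", "yyyy-MM-dd HH:mm:ss,SSSSSS", "MMM dd YYYY HH:mm:ss" ]
  remove_field => ["timestamp"]
  target => "@timestamp"
}
```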

Thanks for your response.

I tried adding "SSSSSS", but it still returned this error:
{:timestamp=>"2016-09-15T09:33:12.980000+0100", :message=>"Failed parsing date from field", :field=>"timestamp", :value=>"2016-09-15 08:33:05.813000", :exception=>"Invalid format: "2016-09-15 08:33:05.813000"", :config_parsers=>"ISO8601,yyyy-MM-dd'T'HH:mm:ss.SSSSSSZZ,yyyy-MM-dd HH:mm:ss,SSSSSS,MMM dd YYYY HH:mm:ss", :config_locale=>"default=en_US", :level=>:warn}

I noticed that when I changed the comma ("yyyy-MM-dd HH:mm:ss,SSSSSS") to a dot ("yyyy-MM-dd HH:mm:ss.SSSSSS"), Logstash started without the error, but Kibana stopped visualizing. Once I put the comma back, Kibana started working again and Logstash started giving the error again.
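
That is consistent with the logged value "2016-09-15 08:33:05.813000", which has a dot (not a comma) before the microseconds, so the dot variant is the pattern that actually matches your data. A sketch of the date filter with that pattern:

```
date {
  match => [ "timestamp", "ISO8601", "yyyy-MM-dd'T'HH:mm:ss.SSSSSSZZ", "yyyy-MM-dd HH:mm:ss.SSSSSS", "MMM dd YYYY HH:mm:ss" ]
  remove_field => ["timestamp"]
  target => "@timestamp"
}
```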

Could this issue be related to the Elasticsearch mapping?
Elasticsearch mapping:
"@timestamp": {
  "format": "yyyy-MM-dd'T'HH:mm:ss.SSSZ",
  "index": "not_analyzed",
  "type": "date"
}
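
If the mapping's fixed format cannot parse the ISO8601 values Logstash writes into @timestamp, one option (a sketch; verify against your Elasticsearch version, and note that Elasticsearch accepts multiple formats joined with ||) is to use the default date format, which in Elasticsearch 2.x is strict_date_optional_time||epoch_millis:

```
"@timestamp": {
  "format": "strict_date_optional_time||epoch_millis",
  "index": "not_analyzed",
  "type": "date"
}
```

Changing a field's format generally requires reindexing into a new index (or relying on a new daily index picking up an updated template), since an existing date mapping cannot be changed in place.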