Mixed log with timestamp and JSON data

I have mixed log data with a timestamp first, followed by JSON, which I am parsing with the following filter:

```
filter {
   grok {
      match => [ "message", "%{DATESTAMP} %{GREEDYDATA:message}" ]
      overwrite => ["message"]
   }
   json {
      source => "message"
      remove_field => ["message"]
   }
}
```

where `%{DATESTAMP}` matches the timestamp at the beginning of the message and `%{GREEDYDATA:message}` captures the JSON data.
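For reference, a hypothetical input line like the following would match that pattern (`%{DATESTAMP}` is defined as `%{DATE}[- ]%{TIME}` and covers formats such as `MM/dd/yyyy HH:mm:ss`; the field values here are made up):

```
08/16/2015 14:20:04 {"id":"55d09c10f2aee2000133673a","stream":"stdout"}
```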

The result is only partially working: the timestamp is removed, but the JSON cannot be parsed:
```"message": "{\\\"id\\\":\\\"55d09c10f2aee2000133673a\\\",\\\"fingerprintjs_id\\\":\\\"\\\",\\\"email\\\":\\\"\\\",\\\"app_key\\\":\\\"ZSp-0vi8_0yRDY66bW--dg\\\",\\\"referrer\\\":\\\"http://www.domain.com/\\\",\\\"client_timestamp\\\":\\\"2015-08-16T14:20:04.709Z\\\",\\\"page_view_client_key\\\":\\\"b9d235da2011439734804706\\\",\\\"user_agent\\\":\\\"Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.155 Safari/537.36\\\",\\\"keywords\\\":\\\"\\\",\\\"description\\\":\\\"Police, military and search teams would check the area in Oksibil district where there had been reports of the crash.\\\",\\\"title\\\":\\\"Missing Indonesian plane with 54 crashed, report residents - Khaleej Times\\\",\\\"uid\\\":\\\"55d09c10f2aee2000133673b\\\",\\\"session_id\\\":\\\"55d09c10f2aee2000133673c\\\",\\\"url\\\":\\\"http://www.domain.com/international/rest-of-asia/missing-indonesian-plane-with-54-crashed-report-residents\\\",\\\"ip\\\":\\\"83.110.196.21\\\",\\\"received_at\\\":\\\"2015-08-16T14:20:00.802Z\\\"}\\n\",\"stream\":\"stdout\",\"time\":\"2015-08-16T14:20:00.802862057Z\"}",```

but it still shows:
`[0] "_jsonparsefailure"`

I suspect the problem is the large number of escape (`\\\`) characters in the JSON. Is that normal? (Note: I am not using the json codec, only the json filter together with grok.)

Try adding a mutate filter before the json filter and use the gsub parameter to strip out those backslashes from the JSON. Something like:

```
mutate {
  gsub => [ "message", "[\\]", "" ]
}
```
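Putting it together, the full pipeline would look something like the sketch below. Filter order matters: the gsub must run after grok (so it operates on the JSON payload) and before the json filter. Note the assumption that every backslash can be removed safely; this will also strip legitimate escape sequences such as `\n` inside string values, so check your data before relying on it:

```
filter {
  grok {
    match => [ "message", "%{DATESTAMP} %{GREEDYDATA:message}" ]
    overwrite => ["message"]
  }
  # Strip the extra level of backslash-escaping added by the log shipper
  # so that the json filter below sees valid JSON.
  mutate {
    gsub => [ "message", "[\\]", "" ]
  }
  json {
    source => "message"
    remove_field => ["message"]
  }
}
```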