Quick help required: Logstash 2.3.1 Grok Plugin Parse Error

Hi,

Your quick help on the following issue is greatly appreciated.

We are getting the following error when Logstash executes the grok filter. The grok configuration was working perfectly earlier.

S/W versions:

  1. Logstash 2.3.1 (logstash-all-plugins-2.3.1)
  2. Kafka input plugin for Logstash: logstash-input-kafka 2.0.6
  3. Kafka: kafka_2.11-0.9.0.0
  4. Elasticsearch 2.2.0

ERROR:
{:timestamp=>"2016-04-09T04:35:13.222000-0500", :message=>"JSON parse failure. Falling back to plain-text", :error=>#<LogStash::Json::ParserError: Unexpected character ('-' (code 45)): Expected space separating root-level values
at [Source: [B@4b6f6b88; line: 1, column: 6]>, :data=>"2016-04-09 04:35:13,218 {"Timestamp":"2016-04-09T04:35:13.216-05:00","CorrelationID":"c7bac074-2985-446c-a21e-6bb50358d1cc","TransactionID":"cd400826-065e-44b2-8e19-a623915cea2e","ApplicationID":"XXX","Hostname":"xxx.xxx.com","HostIP":"xxx.xx.xx.xxx","Service":"TrainService","Operation":"getEvent","ApplicationOrServicetype":"REST","Operationtype":"GET","OperationURL":"http://xxxx:8080/xxx/track/train/T1111","Code":"BARES","Subcode":"EventService-getEvent-1","Payload":"<?xml version=\\\"1.0\\\" encoding=\\\"UTF-8\\\" standalone=\\\"yes\\\"?>T111112,12AEI123D9999999","CarID":"T1111","BaseURI":"http://xxxx:8080/xxx/track/"}", :level=>:error}

GROK script:

filter {
  grok {
    match => { "message" => '%{DATESTAMP:MessageDate}[, ]%{NUMBER:milliseconds} %{GREEDYDATA:MessageData}' }
    add_field => [ "received_at", "%{@timestamp}" ]
    add_field => [ "received_from", "%{host}" ]
  }
  json {
    source => "MessageData"
  }
  mutate {
    gsub => [
      # Mask all sensitive data. NOTE: backslashes need to be escaped
      "Payload", '"sensitiveInfo"[=:]".*"', '"sensitiveInfo":"[REDACTED]"',
      "MessageData", '\\"sensitiveInfo\\"[=:]\\".*\\"', '\"sensitiveInfo\":\"[REDACTED]\"',
      "Payload", ".*", "[REDACTED]",
      "MessageData", ".*", "[REDACTED]"
    ]
  }
}

Why are all the double quotes in your post escaped?

{:timestamp=>"2016-04-09T04:35:13.222000-0500", :message=>"JSON parse failure. Falling back to plain-text", ...

This indicates that you're using the json codec for your kafka input, but that's a bad idea since the payload obviously isn't pure JSON. Switch to the plain codec instead.
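
With logstash-input-kafka 2.0.6 that would look something like this (the topic and Zookeeper settings below are placeholders; keep your own values):

input {
  kafka {
    zk_connect => "localhost:2181"   # placeholder; keep your existing connection settings
    topic_id => "your-topic"         # placeholder; keep your existing topic
    codec => plain                   # let the grok/json filters do the parsing instead
  }
}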

match => { "message" => '%{DATESTAMP:MessageDate}[, ]%{NUMBER:milliseconds} %{GREEDYDATA:MessageData}' }

If you look at the definition of DATESTAMP in the grok patterns file, you'll see that it won't match your timestamp: its DATE subpattern only matches day-first or month-first dates like 04-09-2016, not year-first ISO8601 dates like 2016-04-09. Use TIMESTAMP_ISO8601 instead (and then you don't have to capture the milliseconds separately).
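
Something like this should do it (untested sketch):

match => { "message" => '%{TIMESTAMP_ISO8601:MessageDate} %{GREEDYDATA:MessageData}' }

TIMESTAMP_ISO8601 matches "2016-04-09 04:35:13,218" in full, since the SECOND subpattern includes the optional ",218" fractional part.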

With those corrections I think it'll work.