Hi,
Your quick help with the following issue would be greatly appreciated.
We are getting the following error when Logstash executes the grok filter.
The grok configuration was working perfectly earlier.
S/W versions:
- Logstash 2.3.1 (logstash-all-plugins-2.3.1)
- Kafka input plugin for Logstash - logstash-input-kafka (2.0.6)
- Kafka - kafka_2.11-0.9.0.0
- Elasticsearch 2.2.0
ERROR:
{:timestamp=>"2016-04-09T04:35:13.222000-0500", :message=>"JSON parse failure. Falling back to plain-text", :error=>#<LogStash::Json::ParserError: Unexpected character ('-' (code 45)): Expected space separating root-level values
at [Source: [B@4b6f6b88; line: 1, column: 6]>, :data=>"2016-04-09 04:35:13,218 {"Timestamp":"2016-04-09T04:35:13.216-05:00","CorrelationID":"c7bac074-2985-446c-a21e-6bb50358d1cc","TransactionID":"cd400826-065e-44b2-8e19-a623915cea2e","ApplicationID":"XXX","Hostname":"xxx.xxx.com","HostIP":"xxx.xx.xx.xxx","Service":"TrainService","Operation":"getEvent","ApplicationOrServicetype":"REST","Operationtype":"GET","OperationURL":"http://xxxx:8080/xxx/track/train/T1111","Code":"BARES","Subcode":"EventService-getEvent-1","Payload":"<?xml version=\\\"1.0\\\" encoding=\\\"UTF-8\\\" standalone=\\\"yes\\\"?>T111112,12AEI123D9999999","CarID":"T1111","BaseURI":"http://xxxx:8080/xxx/track/"}", :level=>:error}
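As far as we can tell from the log, the "JSON parse failure. Falling back to plain-text" message comes from the input's json codec rather than from grok: the parser reads the leading "2016" as a root-level number and then trips over the '-' that follows, i.e. over the plain-text timestamp that sits in front of the JSON object. Below is a minimal sketch of what we mean, assuming the kafka input currently uses codec => json (we have only pasted the filter block here, so the topic and ZooKeeper address are placeholders); with a plain codec, the grok/json filters below would do the parsing instead:

input {
  kafka {
    zk_connect => "localhost:2181"  # placeholder ZooKeeper address
    topic_id   => "my_topic"        # placeholder topic name
    codec      => plain             # instead of codec => json; let the filters parse the line
  }
}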
Grok filter configuration:
filter {
  grok {
    match => { "message" => '%{DATESTAMP:MessageDate}[, ]%{NUMBER:milliseconds} %{GREEDYDATA:MessageData}' }
    add_field => {
      "received_at"   => "%{@timestamp}"
      "received_from" => "%{host}"
    }
  }
  json {
    source => "MessageData"
  }
  mutate {
    gsub => [
      # Mask all sensitive data. NOTE: backslashes need to be escaped.
      "Payload", '"sensitiveInfo"[=:]"."', '"sensitiveInfo":"[REDACTED]"',
      "MessageData", '\\"sensitiveInfo\\"[=:]\\".\\"', '\"sensitiveInfo\":\"[REDACTED]\"',
      "Payload", ".", "[REDACTED]",
      "MessageData", ".", "[REDACTED]"
    ]
  }
}
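Separately, while re-checking the pattern we noticed a possible problem with the match itself: the date in the sample line (2016-04-09) is year-first, but DATESTAMP is defined as %{DATE}[- ]%{TIME}, and DATE only covers DATE_US (month-first) and DATE_EU (day-first). Neither matches a year-first date, so the grok above should fail with _grokparsefailure on this line. A minimal sketch of an alternative match, assuming our lines always start with an ISO 8601 timestamp like "2016-04-09 04:35:13,218":

grok {
  match => { "message" => '%{TIMESTAMP_ISO8601:MessageDate} %{GREEDYDATA:MessageData}' }
  add_field => {
    "received_at"   => "%{@timestamp}"
    "received_from" => "%{host}"
  }
}

The SECOND sub-pattern inside TIMESTAMP_ISO8601 already absorbs the ",218" fraction, so a separate %{NUMBER:milliseconds} capture would not be needed with this variant.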