Hello,
I'm using the json filter to parse a JSON-formatted logfile. Each JSON record itself contains a message field.
The (formatted) record looks like this:
{
  "transactionId": "1559728771468",
  "hostName": "AUE2RHEESSVMSSP0100000S",
  "loglevel": "INFO",
  "logType": "APP",
  "message": "Inside DAO : getData()METHOD-ENDS",
  "dateTime": "06/05/2019 09:59:34.800 UTC"
}
In the Logstash log I'm getting these errors:
[2019-06-05T00:00:27,705][WARN ][logstash.filters.json ] Error parsing json {:source=>"message", :raw=>"Inside DAO : getData()METHOD-ENDS", :exception=>#<LogStash::Json::ParserError: Unrecognized token 'Inside': was expecting 'null', 'true', 'false' or NaN
at [Source: (byte)"Inside DAO : getData()METHOD-ENDS"; line: 1, column: 8]>}
The Logstash config looks like this:
input {
  file {
    path => "/logs/appLog.log"
    start_position => "beginning"
    type => "applog"
  }
}

filter {
  if [path] =~ /\/logs\/appLog\.log/ {
    json {
      source => "message"
    }

    # This filter builds @timestamp from dateTime, e.g.:
    # 10/10/2018 21:10:27.837 UTC
    # 10/10/2018 21:10:27,837 UTC
    # 2018-10-18T08:00:35.629+00:00
    # 2018-10-18T08:00:35,629+00:00
    date {
      match => [ "dateTime",
        "ISO8601",
        "MM/dd/yyyy HH:mm:ss.SSS ZZZ",
        "MM/dd/yyyy HH:mm:ss,SSS ZZZ",
        "yyyy-MM-dd'T'HH:mm:ss.SSSZZ",
        "yyyy-MM-dd'T'HH:mm:ss,SSSZZ"
      ]
    }
  }
}
...
It looks like Logstash is parsing the whole JSON record and then parsing the "message" field a second time, so the record gets tagged with _jsonparsefailure. I don't want that field parsed; it's plain text. The json filter uses "message" as its source, but by "message" I mean the whole record that the file input puts into the message field, not the message key inside the JSON.
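One thing I've considered (but not tried yet) is decoding each line at the input stage with the json codec instead of using a json filter, so the inner message field would never be handed back to a JSON parser. A sketch, assuming the same path:

```
input {
  file {
    path => "/logs/appLog.log"
    start_position => "beginning"
    type => "applog"
    codec => "json"   # each line is decoded as JSON before any filters run
  }
}
```

I'm not sure whether that changes anything else about the event layout, though.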
We send the events on to Graylog, where the parsed JSON looks fine; the only oddity is the _jsonparsefailure tag. I want to get rid of these warnings because one is logged for every record, so the Logstash log gets cluttered and grows huge pretty quickly.
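As a workaround to at least silence the log noise, I was thinking of the json filter's skip_on_invalid_json option (a sketch; I believe the option exists in the filter version shipped with 6.4.x, but I haven't verified it):

```
filter {
  json {
    source => "message"
    skip_on_invalid_json => true   # don't log a warning when the source field isn't valid JSON
  }
}
```

That would only hide the symptom, though; I'd still like to understand why the field is being parsed twice.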
Logstash versions: 6.4.2, 6.4.3
OS: RHEL 7.5