Logstash json parse failure when json contains a "message" field

Hello,
I'm using the json filter to parse a JSON-formatted logfile. The JSON includes a "message" field.
The (formatted) record looks like this:
{
  "transactionId": "1559728771468",
  "hostName": "AUE2RHEESSVMSSP0100000S",
  "loglevel": "INFO",
  "logType": "APP",
  "message": "Inside DAO : getData()METHOD-ENDS",
  "dateTime": "06/05/2019 09:59:34.800 UTC"
}
In the logstash log I'm getting these errors:
[2019-06-05T00:00:27,705][WARN ][logstash.filters.json ] Error parsing json {:source=>"message", :raw=>"Inside DAO : getData()METHOD-ENDS", :exception=>#<LogStash::Json::ParserError: Unrecognized token 'Inside': was expecting 'null', 'true', 'false' or NaN
at [Source: (byte)"Inside DAO : getData()METHOD-ENDS"; line: 1, column: 8]>}

The logstash config looks like this:
input {
  file {
    path => "/logs/appLog.log"
    start_position => "beginning"
    type => "applog"
  }
}

filter {

  if [path] =~ /\/logs\/appLog\.log/ {
    json {
      source => "message"
    }
    # This filter builds @timestamp
    # 10/10/2018 21:10:27.837 UTC
    # 10/10/2018 21:10:27,837 UTC
    # 2018-10-18T08:00:35.629+00:00
    # 2018-10-18T08:00:35,629+00:00
    date {
      match => [ "dateTime",
        "ISO8601",
        "MM/dd/yyyy HH:mm:ss.SSS ZZZ",
        "MM/dd/yyyy HH:mm:ss,SSS ZZZ",
        "yyyy-MM-dd'T'HH:mm:ss.SSSZZ",
        "yyyy-MM-dd'T'HH:mm:ss,SSSZZ"
      ]
    }
  }
}
...

It looks like Logstash parses the whole JSON record and then tries to parse the "message" field again as JSON. The record ends up tagged with _jsonparsefailure. I don't want that field parsed; it's plain text.
The json filter does get "message" as its source, but that refers to the whole record (the raw line read from the file), not the "message" field inside of it.

We are sending the events to Graylog, where we can see the parsed JSON. Nothing strange there, only the _jsonparsefailure tag.

I want to get rid of these warnings because one is logged for every log record. The whole log gets cluttered and the Logstash logs grow huge pretty quickly.
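As an aside, the json filter can be told not to warn or tag on non-JSON input, independently of whatever is causing the double parse. A minimal sketch: the `skip_on_invalid_json` option is a real option of the json filter, while the leading-brace guard is an assumption about this log format:

```
filter {
  # Only attempt JSON parsing when the line looks like a JSON object
  # (the guard pattern is an assumption about this log format)
  if [message] =~ /^\s*\{/ {
    json {
      source => "message"
      # Suppress both the WARN log entry and the _jsonparsefailure tag
      # when the source is not valid JSON
      skip_on_invalid_json => true
    }
  }
}
```

This only silences the symptom; if the same filter file is being loaded twice, the event would still be parsed twice.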

logstash version: 6.4.2, 6.4.3
OS: RHEL 7.5

Do you have a second copy of the configuration file in path.config? That would result in the json filter being applied twice.

Thank you for your quick reply.
Indeed, there was a backup of an old config file, named something like config.conf.bak.
I thought Logstash read only *.conf files from /etc/logstash/conf.d.
Can I configure it to read only files ending in .conf?

/etc/logstash/pipelines.yml:

- pipeline.id: main
  path.config: "/etc/logstash/conf.d/*.conf"

Is this path.config glob not applied?

I would expect it to be applied. You could run with "--log.level debug --config.debug" to see what it is reading.
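For reference, a sketch of such a debug run in the foreground, assuming the rpm install paths shown elsewhere in this thread:

```
# Run Logstash in the foreground with config debugging enabled;
# --path.settings matches the rpm install layout
/usr/share/logstash/bin/logstash --path.settings /etc/logstash \
  --log.level=debug --config.debug
```

The debug output includes the resolved configuration sources, which should reveal whether the .bak file is being read.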

Found this message in the logfile when logstash starts:
[2019-06-06T07:49:32,095][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified

Note: Logstash was installed from an official rpm.

/etc/systemd/system/logstash.service:
[Unit]
Description=logstash

[Service]
Type=simple
User=logstash
Group=logstash
# Load env vars from /etc/default/ and /etc/sysconfig/ if they exist.
# Prefixing the path with '-' makes it try to load, but if the file doesn't
# exist, it continues onward.
EnvironmentFile=-/etc/default/logstash
EnvironmentFile=-/etc/sysconfig/logstash
ExecStart=/usr/share/logstash/bin/logstash "--path.settings" "/etc/logstash"
Restart=always
WorkingDirectory=/
Nice=19
LimitNOFILE=16384

[Install]
WantedBy=multi-user.target
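That multilocal warning means pipelines.yml is skipped whenever pipeline-level options (such as path.config, config.string, or modules) are supplied via the command line or resolved settings. The ExecStart line above only passes --path.settings, so the extra option is presumably coming from elsewhere. A hedged diagnostic sketch of where to look, using the file paths from the unit file above; these commands only illustrate the check:

```
# Look for pipeline-level settings in logstash.yml; if path.config or
# config.string is set here, pipelines.yml is ignored
grep -E 'path\.config|config\.string|modules' /etc/logstash/logstash.yml

# Check the environment files the unit loads for extra options
cat /etc/default/logstash /etc/sysconfig/logstash 2>/dev/null

# Confirm the effective ExecStart line of the running service
systemctl cat logstash | grep ExecStart
```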

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.