In short, I am getting the following error in the Logstash console output:
JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unrecognized token 'new': was expecting 'null', 'true', 'false' or NaN at
My setup looks like this:
Spring Boot -> log-file.json* -> filebeat -> logstash -> elastic
The JSON log file is encoded using logstash-logback-encoder.
log-file.json
{"@timestamp":"2017-09-08T17:23:38.677+01:00","@version":1,"message":"new log content","logger_name":"log.tracing.controller.HelloAController","thread_name":"http-nio-8081-exec-2","level":"INFO","level_value":20000,"X-Span-Export":"false","X-B3-SpanId":"9c7519e0c71db8f7","X-B3-TraceId":"9c7519e0c71db8f7"}
filebeat.yml
filebeat.prospectors:
- input_type: log
  paths:
    - /mnt/log/*.log
  json.overwrite_keys: true
  json.keys_under_root: true
  fields_under_root: true

output.logstash:
  hosts: ['logstash:5044']
logstash.conf
input {
  beats {
    port => 5044
    codec => "json"
  }
}

output {
  elasticsearch {
    hosts => [ 'elasticsearch' ]
    user => 'elastic'
    password => 'changeme'
  }
  stdout { codec => rubydebug }
}
debug output (stdout rubydebug)
JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unrecognized token 'new': was expecting 'null', 'true', 'false' or NaN at [Source: new log content; line: 1, column: 4]>, :data=>"new log content"
{
    "X-Span-Export" => "false",
    "offset" => 307,
    "level" => "INFO",
    "input_type" => "log",
    "source" => "/mnt/log/tracing-A.log",
    "message" => "new log content",
    "type" => "log",
    "tags" => [
        [0] "_jsonparsefailure",
        [1] "beats_input_codec_json_applied"
    ],
    "@timestamp" => 2017-09-08T16:23:38.677Z,
    "X-B3-SpanId" => "9c7519e0c71db8f7",
    "level_value" => 20000,
    "thread_name" => "http-nio-8081-exec-2",
    "@version" => 1,
    "beat" => {
        "hostname" => "1ca3ba38c094",
        "name" => "1ca3ba38c094",
        "version" => "5.5.1"
    },
    "host" => "1ca3ba38c094",
    "X-B3-TraceId" => "9c7519e0c71db8f7",
    "logger_name" => "log.tracing.controller.HelloAController"
}
Further notes
- It's actually working: the logs are parsed correctly and I am able to view them in Kibana. The error message appears to be a false positive to me, or maybe I have not understood something here?
- I wish to use the timestamp from the original JSON log file; that's why I have json.overwrite_keys and json.keys_under_root enabled.
- If I rename the "message" JSON field to "msg" in log-file.json, I no longer receive the error message. However, I would prefer to keep the default logstash-logback-encoder field names and not change it to msg.
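My suspicion, judging from the beats_input_codec_json_applied tag in the output above, is that the event is being JSON-decoded twice: once by Filebeat's json.* settings and then again by the json codec on the beats input, which tries to re-parse the already-extracted message field ("new log content") as JSON and fails. A sketch of the input I am considering as a fix (my assumption, not yet verified):

```
input {
  beats {
    port => 5044
    # no json codec here; Filebeat's json.keys_under_root already
    # decodes each line, so the codec would only re-parse the
    # plain-text "message" field and fail
  }
}
```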