Problem parsing JSON from Beats

Hello, I am new to the Elastic Stack and am trying to get a fairly simple setup running:
logback > Filebeat > Logstash > Elasticsearch > Kibana

I am using the Logstash encoder for Logback to write log entries as JSON lines. Filebeat seems to send the JSON entries to Logstash, where they then run into a parse error:

[2017-07-28T16:53:28,537][ERROR][logstash.codecs.json ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unrecognized token 'My': was expecting ('true', 'false' or 'null')
at [Source: My Message; line: 1, column: 5]>, :data=>"My Message"}

It seems like Logstash is trying to parse the message field as JSON instead of the entire JSON object containing the message.

Here are my configuration files:
Filebeat:

filebeat:
  prospectors:
    - document_type: my_document
      fields:
        logical_hostname: my_host
        app: my_app
      json:
        keys_under_root: true
        add_error_key: true
        message_key: message
      paths:
        - /path/to/log/log.json

output:
  logstash:
    hosts: ["127.0.0.1:5044"]

Logstash:

input {
  beats {
    port => 5044
    codec => json
  }
}

output {
  elasticsearch {
    hosts => "127.0.0.1:9200"
  }
}

And here is an example object published by Filebeat:

{
  "@timestamp": "2017-07-31T11:08:10.299Z",
  "@version": 1,
  "beat": {
    "hostname": "#######",
    "name": "my-name",
    "version": "5.2.1"
  },
  "fields": {
    "app": "my-app",
    "logical_hostname": "my-host"
  },
  "input_type": "log",
  "level": "WARN",
  "level_value": 30000,
  "logger_name": "com.some.package.Class",
  "message": "My Message",
  "offset": 132282,
  "source": "/path/to/log/log.json",
  "thread_name": "ajp-nio-8609-exec-8",
  "type": "my-type"
}

The JSON looks fine to me, so the problem is most likely connected to the Logstash config. I have tried both the json and json_lines codecs. With codec json the log entries do show up in Kibana, but they are tagged with _jsonparsefailure, and the Logstash log shows the error mentioned above.
Any help would be appreciated.

After getting help in the Logstash IRC channel, it turns out that having both the json section in the Filebeat config and codec => json on the Logstash beats input causes the JSON to be decoded twice, and the second decode fails (the message field by then contains plain text, not JSON).
Removing the codec from the beats input, so it defaults to plain, resolved the issue.
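For reference, here is the working Logstash pipeline after the fix, using the same port and host values as in my configs above (since Filebeat already decodes the JSON via its json settings, the beats input needs no codec):

```
input {
  beats {
    port => 5044
    # no codec here: Filebeat's json.keys_under_root already decoded the event,
    # so the beats input should use its default (plain) codec
  }
}

output {
  elasticsearch {
    hosts => "127.0.0.1:9200"
  }
}
```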
