Hi,
I have deployed Filebeat 5.4.0 on a Windows machine, where VisualCron writes its logs in a JSON format. I ship these logs with Filebeat's JSON decoding:
VisualCron log example:
{ "timestamp":"2018-07-24 21:58:00", "JobName":"SOMETHING", "JobId":"666666", "JobRunId":"6666666", "Action": "Start" }, { "timestamp":"2018-07-24 21:58:00", "JobName":"SOMETHING", "JobId":"6666666", "JobRunId":"6666666", "Action": "End", "ExitCode":"0", "Status": "Running", "Result":"Success", "LastRun":"2018-07-24 23:58:00", "LastRunUTC":"2018-07-24 21:58:00", "ExecutionTime":"00:00:00" },
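Note that each object ends with a trailing comma (`},`), so a single event is not valid JSON on its own. A quick check of that assumption (plain Python, with the event shortened for brevity):

```python
import json

# One VisualCron event as it appears in the log file: the trailing
# comma makes the line invalid as a standalone JSON document.
line = '{ "timestamp":"2018-07-24 21:58:00", "JobName":"SOMETHING", "Action": "Start" },'

try:
    json.loads(line)
except json.JSONDecodeError as e:
    # A comparable decode failure to what Filebeat reports on these lines.
    print("decode failed:", e)

# Stripping the trailing comma yields a parseable object.
event = json.loads(line.rstrip().rstrip(','))
print(event["Action"])  # → Start
```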
and my Filebeat config:
filebeat.prospectors:
- input_type: log
  document_type: visualcron-json
  paths:
    - D:\logs\visualcron\*.json
  json.overwrite_keys: true
  json.keys_under_root: true
  json.message_key: message
  json.add_error_key: true
  #json.ignore_decoding_error: true
  multiline.pattern: '^{'
  multiline.negate: true
  multiline.match: after

processors:
- decode_json_fields:
    fields: ['message']
    target: json

output.logstash:
  hosts: ["x.x.x.x:x"]
Logstash then has a simple filter:
filter { if [type] == "visualcron-json" { json { source => "message" } } }
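For reference, I also considered a variant of the filter that strips the trailing comma before parsing (a sketch, not verified; the gsub pattern is an assumption based on the `},` ending in the sample above):

```
filter {
  if [type] == "visualcron-json" {
    # Drop the trailing comma that VisualCron appends after each object
    # (assumption: every event line ends with "},").
    mutate { gsub => [ "message", ",\s*$", "" ] }
    json { source => "message" }
  }
}
```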
The data gets into Elasticsearch and is visualized in Kibana, but I get the following error:
json_error: Error decoding JSON: invalid character '}' looking for beginning of value
and the message field is left unparsed, with all the data dumped into one field:
message {
"timestamp":"2018-07-24 21:58:00",
"JobName":"SOMETHING",
"JobId":"666666",
"JobRunId":"6666666",
"Action": "Start"
},
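If I understand correctly, Filebeat's json.* decoding expects each line to be one self-contained JSON object (NDJSON-style), with no trailing comma, e.g.:

```json
{"timestamp":"2018-07-24 21:58:00","JobName":"SOMETHING","JobId":"666666","JobRunId":"6666666","Action":"Start"}
```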
Any insights on how to parse these JSON logs properly?