- Beat version: 6.4.0
- Operating System: Windows
Maybe what I'm trying to do is unsupported (the answer in the topic "Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}" would suggest that's the case), but looking at the Filebeat documentation (Stdin input | Filebeat Reference [6.4] | Elastic), it seems like I should be able to do this, based on this sentence:
The decoding happens before line filtering and multiline. You can combine JSON decoding with filtering and multiline if you set the "message_key" option.
To be fair, the text immediately above the statement I quoted says: "These options make it possible for Filebeat to decode logs structured as JSON messages. Filebeat processes the logs line by line, so the JSON decoding only works if there is one JSON object per line." So there's a good chance I'm just misunderstanding the documentation; to me, those two statements seem to conflict with one another.
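My best guess at reconciling them is that the multiline settings are meant to be applied to the value of the message_key field after each line has already been decoded, i.e. a log shaped like this (a hypothetical example, not my actual data), where every physical line is still a complete JSON object and the multiline pattern would match against the messageKey values:

{"messageKey": "Exception in thread \"main\"", "level": "error", "logger": "android-exception"}
{"messageKey": "\tat com.example.Foo.bar(Foo.java:42)", "level": "error", "logger": "android-exception"}

If that's the intended reading, then the pretty-printed format below wouldn't qualify, but I'd like to confirm that before restructuring my logs.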
I'm trying to parse logs that look like the following:
{
  "messageKey": "Hello!",
  "blahKey": "blahValue",
  "kregsKey": "Hello!",
  "event_id": "32cebd5d-1542-4703-a5a3-be5eaa90af81",
  "level": "error",
  "logger": "android-exception"
}
And I'm getting the following error (this is the debug output for the published event):
2018-09-06T13:25:55.797-0600 DEBUG [publish] pipeline/processor.go:308 Publish event: {
"@timestamp": "2018-09-06T19:25:50.792Z",
"@metadata": {
"beat": "filebeat",
"type": "doc",
"version": "6.4.0"
},
"offset": 3508,
"json": {
"error": {
"type": "json",
"message": "Error decoding JSON: json: cannot unmarshal string into Go value of type >map[string]interface {}"
},
"messageKey": "{\n\t"messageKey": "Hello!",\n\t"blahKey": "blahValue",\n\t"kregsKey": >"Hello!",\n\t"event_id": "32cebd5d-1542-4703-a5a3-be5eaa90af81",\n\t"level": >"error",\n\t"logger": "android-exception""
},
"input": {
"type": "log"
},
"prospector": {
"type": "log"
},
"beat": {
"hostname": "LYNCHC18",
"version": "6.4.0",
"name": "LYNCHC18"
},
"host": {
"name": "LYNCHC18"
},
"source": "G:\kregsTestLog2.log"
}
If I condense the JSON message onto one line, it is parsed and sent correctly. I'm hoping to avoid having to condense all logs onto one line, for readability purposes: my logs are read by multiple people, not all of whom use Kibana.
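For reference, this is the condensed form that does parse correctly (the same event as above, collapsed onto a single line):

{"messageKey": "Hello!", "blahKey": "blahValue", "kregsKey": "Hello!", "event_id": "32cebd5d-1542-4703-a5a3-be5eaa90af81", "level": "error", "logger": "android-exception"}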
Am I misunderstanding the documentation, and Filebeat doesn't actually support multiline JSON parsing? Here's my current configuration:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - G:\kregsTestLog2.log
  json.message_key: messageKey
  json.keys_under_root: false
  json.add_error_key: true
  multiline.pattern: '^{'
  multiline.negate: true
  multiline.match: after

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 3

setup.kibana:

output.elasticsearch:
  hosts: ["search-craigs-test-elasticsearch-uhvnj6zt5s3px2rldniycjslga.us-east-2.es.amazonaws.com:443"]
  protocol: "https"
Thanks so much in advance for your help!