Trouble parsing multi-line json logs

  • Beat version
    • 6.4.0
  • Operating System
    • Windows

Maybe what I'm trying to do is unsupported (the answer in Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {} suggests that's the case), but looking at the Filebeat documentation (Stdin input | Filebeat Reference [6.4] | Elastic), it seems like I should be able to do this, based on the sentence:

The decoding happens before line filtering and multiline. You can combine JSON decoding with filtering and multiline if you set the "message_key" option.

To be fair, the line immediately above that statement says: These options make it possible for Filebeat to decode logs structured as JSON messages. Filebeat processes the logs line by line, so the JSON decoding only works if there is one JSON object per line. So there's a good chance I'm just misunderstanding the documentation; to me, those two statements seem to conflict with one another.

I'm trying to parse logs that look like the following:

{
  "messageKey": "Hello!",
  "blahKey": "blahValue",
  "kregsKey": "Hello!",
  "event_id": "32cebd5d-1542-4703-a5a3-be5eaa90af81",
  "level": "error",
  "logger": "android-exception"
}

And I'm getting the following error message:

2018-09-06T13:25:55.797-0600 DEBUG [publish] pipeline/processor.go:308 Publish event: {
  "@timestamp": "2018-09-06T19:25:50.792Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.4.0"
  },
  "offset": 3508,
  "json": {
    "error": {
      "type": "json",
      "message": "Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}"
    },
    "messageKey": "{\n\t\"messageKey\": \"Hello!\",\n\t\"blahKey\": \"blahValue\",\n\t\"kregsKey\": \"Hello!\",\n\t\"event_id\": \"32cebd5d-1542-4703-a5a3-be5eaa90af81\",\n\t\"level\": \"error\",\n\t\"logger\": \"android-exception\""
  },
  "input": {
    "type": "log"
  },
  "prospector": {
    "type": "log"
  },
  "beat": {
    "hostname": "LYNCHC18",
    "version": "6.4.0",
    "name": "LYNCHC18"
  },
  "host": {
    "name": "LYNCHC18"
  },
  "source": "G:\\kregsTestLog2.log"
}

If I condense the JSON message onto one line, it is parsed and sent correctly. I'm hoping to avoid having to condense all logs onto one line, for readability: my logs are consumed by multiple people, not all of whom use Kibana.

Am I misunderstanding the documentation, and Filebeat doesn't actually support multiline JSON parsing? Here's my current configuration:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - G:\kregsTestLog2.log
  json.message_key: messageKey
  json.keys_under_root: false
  json.add_error_key: true
  multiline.pattern: '^{'
  multiline.negate: true
  multiline.match: after

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 3

setup.kibana:

output.elasticsearch:
  hosts: ["search-craigs-test-elasticsearch-uhvnj6zt5s3px2rldniycjslga.us-east-2.es.amazonaws.com:443"]
  protocol: "https"

Thanks so much in advance for your help!

What do you mean by multiline JSON parsing? Aggregating the lines under messageKey?

If yes, you need to set json.keys_under_root to true. This way Filebeat puts the message "Hello!" under the root of the event as messageKey. Then you can do the multiline aggregation based on messageKey.
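If it helps, here is roughly what that change would look like in the input section from the original post (an untested sketch; the path and multiline settings are copied from the question, only json.keys_under_root differs):

```yaml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - G:\kregsTestLog2.log
  json.message_key: messageKey
  json.keys_under_root: true   # lift messageKey to the root of the event
  json.add_error_key: true
  multiline.pattern: '^{'      # a new event starts at an opening brace
  multiline.negate: true
  multiline.match: after
```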

I'm just trying to get similar functionality whether the event is multi-line or on a single line. The following works without issue:

{ "messageKey": "Hello!", "blahKey": "blahValue", "kregsKey": "Hello!", "event_id": "32cebd5d-1542-4703-a5a3-be5eaa90af81", "level": "error", "logger": "android-exception" }

In Elasticsearch (Kibana), the record shows up like so:

But as soon as I add additional lines like so:

{
  "messageKey": "Hello!",
  "blahKey": "blahValue",
  "kregsKey": "Hello!",
  "event_id": "32cebd5d-1542-4703-a5a3-be5eaa90af81",
  "level": "error",
  "logger": "android-exception"
}

I get the error "Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}"

It might also be worth mentioning that the error invalid character '}' looking for beginning of value is what shows up as the Elasticsearch error:

Hopefully this makes things clearer. I don't actually care about the messageKey key in the JSON - I only added it to try to get things working, since, if I understand the documentation correctly (You can combine JSON decoding with filtering and multiline if you set the "message_key" option), a message key appears to be necessary if I want to capture multiline JSON logs.

By the way, I did try setting json.keys_under_root to true and am getting the same error :frowning:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.