Filebeat Error decoding JSON: json: cannot unmarshal string into Go value of type map

Hello,

I have Filebeat 6.5.0 set up with the following filebeat.yml:
> filebeat.inputs:
> - type: log
>   enabled: true
>   paths:
>     - "C:/logs/info/*registry2json.json"
>   scan_frequency: 10s
>   json.keys_under_root: true
>   json.add_error_key: true
>   json.messsage_key: message
>   tags: registry_json
>   # close_inactive: 1m
>   clean_removed: true
>
> - type: log
>   enabled: true
>   paths:
>     - "C:/logs/info/lexmark_printer.json"
>   scan_frequency: 10s
>   json.keys_under_root: true
>   json.add_error_key: true
>   json.messsage_key: printerList
>   tags: dev-lexmark-printer
>   # close_inactive: 1m
>   close_eof: true
>   close_removed: true
>
> - type: log
>   enabled: true
>   paths:
>     - "C:/logs/info/btr.json"
>   scan_frequency: 1m
>   json.keys_under_root: true
>   # json.add_error_key: true
>   # json.messsage_key: message
>   tags: dev-btr
>   # close_inactive: 1m
>   close_removed: true
>
> - type: log
>   enabled: true
>   paths:
>     - "C:/logs/info/TOAST.json"
>   scan_frequency: 10m
>   json.keys_under_root: true
>   json.add_error_key: true
>   json.messsage_key: message
>   tags: dev-toast
>   # close_inactive: 1m
>   # close_eof: true
>   close_removed: true
> #----------------------------- Logstash output
> when:
>   contains:
>     tags: registry_json
> output.logstash:
>   hosts: "logstash:5044"
>   ssl.enabled: true
>   ssl.verification_mode: full
>   ssl.certificate: "certfilehere.crt"
>   ssl.key: "certfilehere.key"
>
> when:
>   contains:
>     tags: dev-lexmark-printer
> logstash.output:
>   hosts: "logstash:5040"
>   ssl.enabled: true
>   ssl.verification_mode: full
>   ssl.certificate: "certfilehere.crt"
>   ssl.key: "certfilehere.key"
>
> when:
>   contains:
>     tags: dev-btr
> logstash.output:
>   hosts: "logstash:5041"
>   ssl.enabled: true
>   ssl.verification_mode: full
>   ssl.certificate: "certfilehere.crt"
>   ssl.key: "certfilehere.key"
>
> when:
>   contains:
>     tags: dev-toast
> logstash.output:
>   hosts: "logstash:5042"
>   ssl.enabled: true
>   ssl.verification_mode: full
>   ssl.certificate: "certfilehere.crt"
>   ssl.key: "certfilehere.key"

For all of the files except registry2json.json, I am getting the following error:
ERROR json/json.go:51 Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}

This is happening to every line in the files. I have verified that the JSON files are formatted correctly.

What am I setting incorrectly? Any help would be greatly appreciated.

Please format logs and configuration files using the </> button in the editor window.

The json input only supports one JSON document per line, and it requires a line delimiter (newline) to pick up the line.
Each line must be an object, not a string or any other data type. Potential reasons the parser fails with this message are:

  • multiline JSON
  • the line is only a JSON string, not an object.
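
To illustrate, what the json parsing expects is newline-delimited: one complete JSON object per line. A minimal example (field names and values made up for illustration):

    {"message": "event one", "status": "ok"}
    {"message": "event two", "status": "error"}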

Filebeat supports only one output. What exactly are you trying to achieve with your different when clauses?
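
For context, a single Logstash output section, reusing the host and certificate file names from your config, would look roughly like this; any routing by tag then has to happen on the Logstash side, not via multiple outputs in Filebeat:

    output.logstash:
      hosts: ["logstash:5044"]
      ssl.enabled: true
      ssl.verification_mode: full
      ssl.certificate: "certfilehere.crt"
      ssl.key: "certfilehere.key"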

Sorry about the formatting.

I will look into multiline JSON; however, the JSON files I am scanning are formatted correctly, at least according to JSON lint.

I have Filebeat configured to send the files to different Logstash pipelines by tagging them. Logstash then processes and sends the different files to different indices.

I am trying to avoid a single massive Logstash pipeline, as all of that filtering gets expensive.

Would pipeline-to-pipeline be a better way? That is, send everything from Filebeat to one Logstash pipeline, have that pipeline "sort" events with some filter, and then ship them to the "processing/filtering" pipeline(s)? A distributor pipeline-to-pipeline configuration?
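
Roughly, a distributor-style pipelines.yml sketch (pipeline IDs are made up, per-dataset filters and outputs are omitted, and it assumes a single Beats input on 5044):

    - pipeline.id: distributor
      config.string: |
        input { beats { port => 5044 } }
        output {
          if "dev-toast" in [tags] {
            pipeline { send_to => ["toast-processing"] }
          } else if "dev-btr" in [tags] {
            pipeline { send_to => ["btr-processing"] }
          }
        }
    - pipeline.id: toast-processing
      config.string: |
        input { pipeline { address => "toast-processing" } }
        # dataset-specific filters and the output for this data set go here
    - pipeline.id: btr-processing
      config.string: |
        input { pipeline { address => "btr-processing" } }
        # dataset-specific filters and the output for this data set go here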

Also, how does pipeline-to-pipeline work with centralized pipeline management?

EDIT: I answered my question about Logstash and pipeline to pipeline. Still trying to solve the original error in Filebeat.

JSON linters do not care about the exact formatting (e.g. pretty-printed, multiline), and they do not care about the actual data type. Filebeat does: it requires one complete JSON document per line, and that document must be an object.

Try to find a minimal registry2json.json (ideally one event only) that reproduces the problem. If we can reliably reproduce it, we can have a look at the file plus the log to see what the actual problem is.

@steffens
registry2json.json does not error, but the TOAST.json does. Here is a sample of what we are trying to ship:

{
  "Version": "2.20.0.104",
  "TimeStamp": "2/26/2019 9:08:29 AM",
  "Computers": "20",
  "Printers": "1",
  "Other": "0",
  "Unknown": "0",
  "Computer": [
    {
      "Name": "09280-SERVER-X0",
      "IP": "192.168.11.25",
      "MAC": "macaddiehere",
      "OSName": "Microsoft Windows Server 2016 Standard",
      "OSReleaseID": "1607",
      "TimeZoneName": "Central Standard Time",
      "LANTaxServer": "09280-SERVER-X0",
      "Habitat": "QA",
      "MarimbaEnv": "QA",
      "OfficeId": "9280",
      "Ownership": "COMPANY",
      "Ownership2": "C",
      "Role": "SERVER",
      "OfficeType": "SMALL_OFFICE",
      "Manufacturer": "major maker",
      "Model": " TWR",
      "Serial": "Serialhere",
      "Total_RAM_GB": "32",
      "CPU_Name": "Intel(R) Core(TM) i5-6500 CPU @ 3.20GHz",
      "CPU_Speed_MHz": "3192",
      "Is64BitOS": "true",
      "HD_Size_GB": "476",
      "HD_PartitionCount": "3",
      "HD_ActivePartition": "2",
      "CDrive_Size_GB": "97",
      "CDrive_FreeSpace_GB": "40",
      "SDrive_FreeSpace_GB": "176",
      "NetSpeed_Mbit": "1000",
      "HasDebitPad": "false",
      "HasMagtek": "false",
      "HasUPS": "false",
      "CATSVersion": "19.5.66.62",
      "FIDOVersion": "2.2.0.7",
      "OpenOfficeVersion": "",
      "Version": "9.0.04m",
      "Version1": "10.4.1.490",
      "NonNetworkPrinter": [
        {
          "name": "Microsoft Print to PDF",
          "model": "Unknown",
          "driver": "Microsoft Print To PDF",
          "port": "PORTPROMPT:",
          "isShared": "false",
          "setAsDefaultEntry": "Microsoft Print to PDF,winspool,Ne01:"
        },
        {
          "name": "Microsoft XPS Document Writer",
          "model": "Unknown",
          "driver": "Microsoft XPS Document Writer v4",
          "port": "PORTPROMPT:",
          "isShared": "false",
          "setAsDefaultEntry": "Microsoft XPS Document Writer,winspool,Ne00:"
        }
      ],
      "Driver": [
        {
          "name": "Microsoft XPS Document Writer v4"
        },
        {
          "name": "Microsoft Print To PDF"
        },
        {
          "name": "Source Technologies Universal v2"
        },
        {
          "name": "Remote Desktop Easy Print"
        },
        {
          "name": "Microsoft enhanced Point and Print compatibility driver"
        },
        {
          "name": "Lexmark Universal v2 XL"
        },
        {
          "name": "Lexmark MS810 Series XL"
        },
        {
          "name": "Lexmark HBP T652"
        },
        {
          "name": "Microsoft enhanced Point and Print compatibility driver"
        }
      ],
      "UserProfile": [
        {
          "Username": "DEFAULT",
          "key": "DEFAULT-PROFILE",
          "profilePath": "c:\\users\\default",
          "defaultPrinter": ""
        }
      ]
    }
  ],
  "NetworkPrinter": {
    "Name": "12112TAXPR2",
    "MAC": "MACaddiehere",
    "IP": "192.168.11.192",
    "IpSource": "DHCP",
    "Model": "Standard Printer",
    "Serial": "1234Serial",
    "FirmwareVersion": "something"
  },
  "Router": {
    "routerIP": "192.168.11.1",
    "make": "Major Label",
    "model": "Model",
    "mac": "MACaddiehere",
    "serial": "Serialhere",
    "firmware": "n/a",
    "wanIP": "WANIPHERE",
    "name": "QA Router 45"
  }
}

I cannot find anything wrong with the formatting.
Any suggestions would be greatly appreciated.

Thanks

This is a multiline, pretty-printed JSON document. Filebeat first splits the content into lines and then tries to parse each line as a single JSON document. Multiline JSON documents are not supported.

The format Filebeat actually supports is also known as ndjson (newline-delimited JSON). Using ndjson makes the processing somewhat more robust if your file contains an invalid document (unfortunately, not every log file or custom JSON encoder is fully compliant). With ndjson you can drop invalid events and still continue processing.

You can either try to use multiline, waiting for the closing }, so that the complete document is presented to the JSON parser, or change the application's logging configuration so that it produces an ndjson-compliant file.
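
As a rough sketch only (the path is taken from your config; double-check the multiline options against the documentation for your Filebeat version), joining a pretty-printed document that starts with { at the beginning of a line could look like:

    - type: log
      enabled: true
      paths:
        - "C:/logs/info/TOAST.json"
      multiline.pattern: '^{'
      multiline.negate: true
      multiline.match: after

Every line that does not start with { gets appended to the preceding line, so the whole pretty-printed document is forwarded as one event; the JSON can then be decoded downstream (for example in Logstash). For large documents you may also need to raise multiline.max_lines (default 500).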
