Issue: Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}

Hi,

I'm using the ELK stack 7.1.1 with X-Pack installed. I'm trying to ship a JSON log through Filebeat into Logstash and apply a grok filter to it, but I'm getting the following error:

Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}

I have attached my configurations below. Please help me understand where I'm going wrong.

Sample log:

{"log":"2019-10-01 07:18:26:854*[DEBUG]*cluster2-nio-worker-0*Connection*userEventTriggered*Connection[cassandraclient/10.3.254.137:9042-1, inFlight=0, closed=false] was inactive for 30 seconds, sending heartbeat\n","stream":"stdout","time":"2019-10-01T07:18:26.85462769Z"}

My output:

{
         "input" => {
        "type" => "log"
    },
           "log" => {
          "file" => {
            "path" => "/home/Desktop/a.log"
        },
        "offset" => 274
    },
      "@version" => "1",
    "@timestamp" => 2019-10-04T06:53:26.046Z,
         "error" => {
           "type" => "json",
        "message" => "Error decoding JSON: json: cannot unmarshal string into Go value of type map[string]interface {}"
    },
          "tags" => [
        [0] "kubelogs",
        [1] "beats_input_raw_event",
        [2] "_grokparsefailure"
    ],
           "ecs" => {
        "version" => "1.0.0"
    }
}

My Filebeat configuration:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /home/Desktop/a.log
  json.keys_under_root: true
  json.add_error_key: true
  json.message_key: log
  tags: ["kubelogs"]

output.logstash:
  hosts: ["localhost:5044"]

My Logstash configuration:

input {
    beats {
        port => "5044"
    }
}


filter {
    grok {
        match => [ "message", '{"log":"%{GREEDYDATA:messagedata}","stream' ]
    }
}


output {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "logstash-%{+YYYY.MM.dd}"
        manage_template => false
        user => "elastic"
        password => "##########"
    }
    stdout { codec => rubydebug }
}

Hey,
you should try adding a processors section to your Filebeat config to decode the JSON data:

processors:
- decode_json_fields:
    fields: ["log"]
    overwrite_keys: true
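
For reference, the whole file with that section added would look roughly like this (an untested sketch based on your config above; note that fields takes a list and the indentation under decode_json_fields matters):

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /home/Desktop/a.log
  json.keys_under_root: true
  json.add_error_key: true
  json.message_key: log
  tags: ["kubelogs"]

processors:
- decode_json_fields:
    fields: ["log"]
    overwrite_keys: true

output.logstash:
  hosts: ["localhost:5044"]

You can sanity-check the result with "filebeat test config" before restarting Filebeat.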

Thanks @juka, it worked.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.