Logstash cannot extract JSON key


(t.b) #1

Hi guys, I need help with a Logstash filter that extracts a JSON key/value into a new field. The following is my Logstash conf.

input {
	tcp {
		port => 5044
	}
}

filter {
	json {
		source => "message"
		add_field => {
			"data" => "%{[message][data]}"
		}
	}
}

output {
		stdout { codec => rubydebug }
}

I have tried with mutate:

filter {
    json {
        source => "message"
    }
    mutate {
        add_field => {
            "data" => "%{[message][data]}"
        }
    }
}

I have tried with . instead of [ ]:

filter {
    json {
        source => "message"
    }
    mutate {
        add_field => {
            "data" => "%{message.data}"
        }
    }
}

I have tried with index number:

filter {
    json {
        source => "message"
    }
    mutate {
        add_field => {
            "data" => "%{[message][0]}"
        }
    }
}

All with no luck. :frowning:

The following json is sent to port 5044:

{"data": "blablabla"}

The problem is that the new field fails to extract the value from the JSON key:

"data" => "%{[message][data]}"


The following is my stdout:

{
           "@version" => "1",
               "host" => "localhost",
               "type" => "logstash",
               "data" => "%{[message][data]}",
               "path" => "/path/from/my/app",
         "@timestamp" => 2019-01-11T20:39:10.845Z,
            "message" => "{\"data\": \"blablabla\"}"
}

However, if I use "data" => "%{[message]}" instead:

filter {
    json {
        source => "message"
        add_field => {
            "data" => "%{[message]}"
        }
    }
}

I get the whole JSON string in stdout:

{
           "@version" => "1",
               "host" => "localhost",
               "type" => "logstash",
               "data" => "{\"data\": \"blablabla\"}",
               "path" => "/path/from/my/app",
         "@timestamp" => 2019-01-11T20:39:10.845Z,
            "message" => "{\"data\": \"blablabla\"}"
}
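Both outputs above are consistent with how the json filter behaves when no target is set: the parsed keys are merged into the event root, while message keeps the raw string. Here is a minimal Python sketch of that behavior (the event dict is hypothetical, just to illustrate):

```python
import json

# Model of `json { source => "message" }` with no target:
# the parsed keys are merged into the event root, and the raw
# string stays untouched in "message".
event = {"message": '{"data": "blablabla"}'}
event.update(json.loads(event["message"]))

print(event["data"])     # -> blablabla  (parsed key at the root)
print(event["message"])  # -> {"data": "blablabla"}  (raw string)
```

So %{[message][data]} cannot resolve: message is still a plain string, not an object. After parsing, the value is already reachable as %{[data]}.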

Can anyone please tell me what I did wrong?

Thank you in advance.

I use the docker-elk stack, ELK_VERSION=6.5.4.


#3

I don't believe your json filter is working because you really just have a key/value pair. Simply add the json_lines codec to your input...

input {
    tcp {
        port => 5044
        codec => "json_lines"
    }
}
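For reference, json_lines expects newline-delimited JSON: one document per line, each frame terminated by a newline. A rough Python sketch of that framing (the payload here is a made-up example):

```python
import json

# json_lines framing: one JSON document per line, each frame
# terminated by a newline character.
payload = '{"data": "blablabla"}\n{"data": "second"}\n'
events = [json.loads(line) for line in payload.splitlines() if line]

print(events[0]["data"])  # -> blablabla
```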


(t.b) #4

I did just that, but it didn't work.

I found the solution to this issue: it was simply nested JSON.

 message: {
     message: {
     }
 }

All I did was use the target option:

filter {
  json {
    source => "message"
    target => "message"
  }
}

filter {
  json {
    source => "[data][message]"
    target => "message"
  }
}
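As a rough Python model of what the target option does (the event dict is hypothetical): the parsed document is stored under the named field instead of being merged into the event root, so nested references like [message][data] start to resolve.

```python
import json

# Model of `json { source => "message" target => "message" }`:
# the parsed document lands under the target field rather than
# at the event root.
event = {"message": '{"data": "blablabla"}'}
event["message"] = json.loads(event["message"])

print(event["message"]["data"])  # -> blablabla
```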
