Parse JSON data wrapped in "message" into event fields


#1

The issue is that my log is in JSON format and is sent from Filebeat to Logstash, but the JSON data gets wrapped in the "message" field and is not parsed into event fields.
I have read a lot of posts on similar issues over this weekend, and I followed the online documentation on configuring Logstash.
The following is my configuration:
input {
  beats {
    port => 5044
    type => "mylog"
  }
}

filter {
  if [type] == "mylog" {
    json {
      source => "message"
    }
  }
}

output {
  stdout { codec => rubydebug }
}

But I found that the log data is still just the value of "message" and does not get parsed.
I would appreciate it very much if anyone could shed some light.
thank you!
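
(For context: the Filebeat side of this setup is not shown in the post. In the 1.x syntax of the time, it presumably looked roughly like the following hypothetical sketch; the path is taken from the output later in the thread and the host is a placeholder.)

filebeat:
  prospectors:
    -
      paths:
        - /mytest1.log
output:
  logstash:
    # Placeholder host; the port matches the beats input above.
    hosts: ["localhost:5044"]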


(Magnus Bäck) #2

This is the correct way of doing it (you could also set codec => json in the beats input). What do you get when you try this?
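
For reference, the codec variant mentioned above would look roughly like this (a minimal sketch reusing the port and type from the original configuration):

input {
  beats {
    port => 5044
    type => "mylog"
    # Decode each incoming payload as JSON at input time,
    # so a separate json filter is not needed.
    codec => json
  }
}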


#3

In Kibana, I can see the JSON data in "message", but I cannot find the keys/values from the JSON data in the list of fields.
I validated the JSON data by running /logstash -f mytest.conf manually, and the JSON data got parsed correctly. Is there a way to get the JSON data parsed without this manual step? I have read many posts but cannot figure out a solution to this specific issue. I believe I must have missed something, and I need help to find out what I missed.
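
(For context, the manual test described here was presumably something like the following hypothetical reconstruction; the file path comes from the rubydebug output shown later in the thread.)

input {
  file {
    path => "/mytest1.log"
    # Read the file from the start on the first run.
    start_position => "beginning"
  }
}

filter {
  json {
    # Parse the JSON string in "message" into top-level event fields.
    source => "message"
  }
}

output {
  stdout { codec => rubydebug }
}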

thank you.


(Magnus Bäck) #4

Please give an example of the stdout { codec => rubydebug } output so that we can see exactly what the resulting events look like.


#5

Logstash startup completed
{
    "message" => "{\"test1\":\"testvalue1\",\"mymessage\":{\"mtest1\":\"test\"}}",
    "@version" => "1",
    "@timestamp" => "2016-02-03T04:42:08.507Z",
    "host" => "...",
    "path" => "/mytest1.log",
    "test1" => "testvalue1",
    "mymessage" => {
        "mtest1" => "test"
    }
}

Elasticsearch result when the log is processed directly by Logstash:
"_source":{"message":"{"test1":"testvalue1","mymessage":{"mtest1":"test"}}","@version":"1","@timestamp":"2016-02-03T04:42:08.507Z","host”:”…”,”test1":"testvalue1","mymessage":{"mtest1":"test"}}

** The JSON data got parsed.


Elasticsearch result when the log is shipped from Filebeat to Logstash:
"_source":{"@metadata":{"beat":"filebeat","type”:”my-log"},"@timestamp":"2016-02-03T04:46:30.234Z","beat":{"hostname”:”…”,”name”:”…”},”count":1,"message":"{"test1":"testvalue1","mymessage":{"mtest1":"test"}}","offset":0,"type”:”my-log"}

** The JSON data did not get parsed.
I guess my question is: why does the JSON data not get parsed by Logstash when it is sent by Filebeat?

thank you!


(Magnus Bäck) #6

Elasticsearch result when the log is processed directly by Logstash:

What do you mean? That you used a file input in Logstash instead of Filebeat?

I don't think the evidence is consistent. How come there's a type field here:

"_source":{"@metadata":{"beat":"filebeat","type”:”my-log"},"@timestamp":"2016-02-03T04:46:30.234Z","beat":{"hostname”:”…”,”name”:”…”},”count":1,"message":"{"test1":"testvalue1","mymessage":{"mtest1":"test"}}","offset":0,"type”:”my-log"}

But not here:

{
    "message" => "{\"test1\":\"testvalue1\",\"mymessage\":{\"mtest1\":\"test\"}}",
    "@version" => "1",
    "@timestamp" => "2016-02-03T04:42:08.507Z",
    "host" => "...",
    "path" => "/mytest1.log",
    "test1" => "testvalue1",
    "mymessage" => {
        "mtest1" => "test"
    }
}

Also, the JSON payload is parsed in the second example but not in the first.
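
(A likely explanation, given this evidence: in Logstash, the type option on an input only applies to events that do not already have a type, and Filebeat is already setting type to "my-log" — note "type":"my-log" in the _source above. The beats input therefore does not relabel the events as "mylog", so the conditional wrapping the json filter never matches and the filter never runs. Assuming that is the cause, a minimal fix would be to match the type Filebeat actually sends, or to drop the conditional entirely:)

filter {
  # Match the type Filebeat actually sets on its events,
  # not the one the beats input tried (and failed) to apply.
  if [type] == "my-log" {
    json {
      source => "message"
    }
  }
}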

