I have a log in the following format; it is plain JSON with nested fields:
{
  "level": "info",
  "message": {
    "req": {
      "headers": {
        "host": "localhost:8080",
        "connection": "keep-alive",
        "x-forwarded-for": "192.168.1.1, 1.1.1.1",
        "x-forwarded-proto": "http"
      },
      "url": "/products?userId=493d0aec-a9a7-42a3",
      "method": "GET",
      "originalUrl": "/products?userId=493d0aec-a9a7-42a3",
      "params": {
        "0": "/products"
      },
      "query": {
        "userId": "493d0aec-a9a7-42a3"
      },
      "body": ""
    },
    "res": {
      "headers": {
        "traceid": "ac586e4e924048",
        "x-correlation-id": "57d7920d-b623-48f8",
        "content-type": "application/json;charset=UTF-8",
        "content-length": "2",
        "date": "Fri, 08 Mar 2019 09:55:45 GMT",
        "connection": "close"
      },
      "statusCode": 200,
      "body": "[]"
    },
    "gateway": "internal"
  },
  "correlationId": "57d7920d-b623-48f8",
  "timestamp": "2019-03-08T09:55:45.833Z"
}
How can I parse it correctly using Filebeat and Logstash so that all JSON fields appear in Kibana as separate (parsed) fields? My problem is the "message" field, which contains nested JSON. I have no problem parsing an event whose "message" field is a string, but I do when it is JSON.
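Each event sits on a single line in the log file (Filebeat does decode the top-level fields, so one JSON document per line seems to hold), meaning Filebeat reads roughly this (abridged from the sample above):

{"level":"info","message":{"req":{...},"res":{...},"gateway":"internal"},"correlationId":"57d7920d-b623-48f8","timestamp":"2019-03-08T09:55:45.833Z"}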
My attempts:
1. I tried to tell Filebeat that the log is JSON with the following configuration (doing nothing on the Logstash side):
filebeat.inputs:
- type: stdin
  json.keys_under_root: true
  json.add_error_key: true
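The only other relevant part of my filebeat.yml is the output section, which ships events straight to Logstash (host and port here are placeholders for my actual setup):

output.logstash:
  hosts: ["localhost:5044"]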
The result is strange to me: in Kibana, "message" arrives as a single string in which every : is replaced with => (it looks like a Ruby hash representation of the object):
{
  "req" => {
    "originalUrl" => "/offers",
    "params" => {
      "0" => "/offers"
    },
    "query" => {},
    "body" => "",
    "headers" => {
      "accept-encoding" => "gzip",
      "user-agent" => "okhttp/3.8.1",
      "x-consumer-id" => "f2a6e4cd-2224-4535
Other fields outside "message" are parsed correctly.
2. I did nothing on the Filebeat side and used a json filter in Logstash:
json {
  source => "message"
  target => "message_json"
}
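The filter sits in an otherwise plain beats-to-Elasticsearch pipeline, roughly like this (port and hosts are placeholders for my actual setup):

input {
  beats {
    port => 5044
  }
}

filter {
  json {
    source => "message"
    target => "message_json"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}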
The logs did not appear in Kibana at all, and I got the following errors in Logstash:
[2019-03-08T09:55:47,084][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-6.5.0-2019.03.08-sdx", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x1e1c3ea9>], :response=>{"index"=>{"_index"=>"filebeat-6.5.0-2019.03.08-sdx", "_type"=>"doc", "_id"=>"ERS6XGkBgE-US7A6Mvt", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [json.message] of type [keyword]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:461"}}}}}

[2019-03-08T09:55:47,085][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-6.5.0-2019.03.08-sdx", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x38ee052e>], :response=>{"index"=>{"_index"=>"filebeat-6.5.0-2019.03.08-sdx", "_type"=>"doc", "_id"=>"EhS6XGkBgE-US7A6Mvt", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [json.message] of type [keyword]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:461"}}}}}
This filter works fine for me if the "message" field is a string (not JSON).
Any ideas how to parse the nested JSON in the "message" field?
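To make the goal concrete, this is roughly what I would like the indexed event to look like in Kibana (a sketch derived from the sample at the top, with the nested keys exposed as separate fields):

{
  "level": "info",
  "message.req.method": "GET",
  "message.req.url": "/products?userId=493d0aec-a9a7-42a3",
  "message.req.headers.host": "localhost:8080",
  "message.res.statusCode": 200,
  "message.gateway": "internal",
  "correlationId": "57d7920d-b623-48f8",
  "timestamp": "2019-03-08T09:55:45.833Z"
}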