Nested JSON parsing issues


I'm using Filebeat and Elasticsearch 7.5.2 and I am trying to stream specific lines from a local log file to ES.
The lines I am interested in have the following format:

{"level":30,"time":1579750597224,"pid":32172,"hostname":"abc","name":"sdk","data":{"TYPE":"REQUEST","UUID":"47DE724F-4198-4563-9BB8-3EA2498B5E4F","METHOD":"POST","URL":"...},"msg":"[47DE724F-4198-4563-9BB8-3EA2498B5E4F] - Request on /api","v":1}

My filebeat config looks like this

- type: log
  enabled: true
  paths:
    - /path/to/log/*output.log

processors:
- drop_event:
    when:
      not:
        regexp:
          message: '^(.*?)(Request|Response)(.*)'
- decode_json_fields:
    fields: ["message"]
    process_array: false
    max_depth: 1
    target: ""
    overwrite_keys: true
    add_error_key: true

I only want the lines that have the nested "data" JSON object, or that contain the words I match with the regex in the config above.

The problem is that the config above seems to drop every log line, even the valid ones. If I omit the decode_json_fields processor, the correct logs appear in Elasticsearch, but I get the message as a plain string, which is useless.
If I omit decode_json_fields and instead add the following to the config:

json.keys_under_root: true
json.add_error_key: true

Then I get nothing in ES again.
It seems that I am missing something about the order in which the processors are executed.

Can someone help?


Hi @ipolyzois 🙂

Maybe you can add a couple of example lines? You gave good information, but the issue seems to be hiding somewhere deeper.

At first sight, that regex looks more complex than it needs to be. I'd also try avoiding the capture groups and give it another go. Maybe you can just leave it as Request|Response?
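Something like this, for example (same structure as your config, just the simpler expression — untested sketch):

```yaml
processors:
- drop_event:
    when:
      not:
        regexp:
          message: 'Request|Response'
```

Since regexp already matches anywhere in the field, the leading and trailing (.*) groups shouldn't be needed.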

Hi @Mario_Castro

Here are a couple of lines. From this example I need only the second one:

{"level":30,"time":1579750573929,"pid":32185,"hostname":"hostname1”,”name":"sdk","msg":"[POST /API] [endpointId = UUID-UUID] todo = 1”,”v”:3}
{"level":30,"time":1579750573931,"pid":32172,"hostname":"hostname1”,”name":"sdk","data":{"TYPE":"REQUEST","UUID”:”uuid”,”METHOD":"POST","URL":"/sdk/events”,”ee_iid:”UUID”,”API_KEY”:”key”-uuid,”HEADERS":{"x-country-code":"US","x-country-name":"United States","x-forwarded-for”:”999.999.999.999,”,”x-forwarded-proto":"https","host":"","connection":"close","content-length":"613","x-forwarded-port":"443",”accept":"*/*","authorization":"Basic ABC”,”x-signature”:”FRE”,”content-type":"application/json","x-endpoint-device":"11.4.1","x-endpoint-id”:”UUID”-HERE,”accept-language":"en-us","user-agent":"11.4.1","accept-encoding":"br, gzip, deflate","x-endpoint-app”:”endpoint”-123,”x-endpoint-sdk-version":"241","cookie":"AWSALB=b64; AWSALBTG=b”64},”TRIGGER_ON":"2020-01-23T03:36:13.931Z","BODY":[{"payload":{"action”:”something”,”data”:{“ch”:”1"}},"type”:”action”,”header":{"clientId”:”uuid”,”clientSha”:”sha1”,”unixTime":"1579750573319","deviceId”:”IDID”,”cs”:”1”,”app”:”iiii”,”platform":"ios”,”ak”:”did”,”sss”:”uuid”,”username”:”xxxx”,”av”:”1.2.3”,”ver”:”2.4.1”,}]},"msg":"Request on /api”,”v”:1}

I also tried the simplified regex as you suggested, but no logs show up in ES.

Does anyone have any ideas about this? I have tried moving the decode_json_fields processor around, but nothing seems to work.

Please: if you need SLAs for your answers, then a commercial support subscription makes sense. At its core this forum is driven entirely by volunteers, which anyone posting a question should respect.

None of the JSON you have posted is a valid JSON object. Maybe you can take a closer look at that. Be aware that " is not the same as ”.
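To illustrate, any standard parser rejects the curly quotes immediately. A quick check with Python's json module (the \u201d escape stands for the curly ” character):

```python
import json

# Straight ASCII quotes (U+0022): valid JSON
valid = '{"hostname": "hostname1", "name": "sdk"}'
print(json.loads(valid)["name"])  # -> sdk

# Same content, but with some quotes turned into curly "smart"
# quotes (U+201D), as word processors and chat clients often do:
invalid = '{"hostname": "hostname1\u201d,\u201dname": "sdk"}'
try:
    json.loads(invalid)
except json.JSONDecodeError as err:
    print("invalid JSON:", err)
```

Filebeat's decode_json_fields behaves the same way: a line it cannot parse as JSON is left (or flagged) rather than decoded.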

Sorry, I did not mean to push anyone; I was just asking whether anyone had thoughts or had come across this before.
Regarding the validity of the JSON: I must have mistyped it while sanitizing the original data. If I validate the original log entry with any online JSON validator, it passes.

That is still invalid JSON. Please, can you post a correct JSON sample of what you are trying to parse? In other words, it is not uncommon for people to think the problem is in Filebeat when the error is actually in their input data.

Thanks for your help, but I ended up streaming everything to Logstash instead and it parsed everything without any issues.
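In case it helps anyone else, the Logstash side needed nothing more than the json filter; roughly like this (the port and hosts are placeholders for my setup):

```
input {
  beats {
    port => 5044
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```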


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.