How to parse custom logs with nested JSONs

How can I parse log files like the one below

2017-07-09 18:42:53,748 || [INFO] || default || || some_text || some_text || /path/to/some/python/file.py || 919 || {"key1": "value1", "key2": {"key3": "value3", "key4": "value4", "key5": "value5"}, "key6": "/some/url/"}

and store each key as a separate field with grok?

The actual logs contain different JSON documents with different sets of keys.

It looks like you should be able to use a csv filter based on the | separator to parse the different parts into separate fields. You can then apply a json filter to the last field, which contains the JSON document.
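
Something like the sketch below is one way to wire that up (untested; the column names and the quote_char choice are assumptions based on the sample line, so adjust them to match what each position actually holds in your logs):

filter {
  # Split the line on the "||" delimiter. The column names are guesses
  # from the sample line; rename them to whatever the fields really mean.
  # quote_char is set to a single quote so the double quotes inside the
  # trailing JSON payload are treated as ordinary characters; pick any
  # character that never appears in your logs.
  csv {
    separator  => "||"
    quote_char => "'"
    columns    => ["timestamp", "level", "logger", "extra", "text1", "text2", "script_path", "line_number", "json_payload"]
  }

  # Trim the whitespace that surrounds each "||" delimiter.
  mutate {
    strip => ["timestamp", "level", "logger", "extra", "text1", "text2", "script_path", "line_number", "json_payload"]
  }

  # Expand the trailing JSON document into individual fields.
  # Nested objects become nested fields, e.g. [key2][key3].
  json {
    source       => "json_payload"
    remove_field => ["json_payload"]
  }
}

Because the json filter simply expands whatever keys are present in the payload, log lines whose JSON documents carry different sets of keys should still end up with each key stored as its own (possibly nested) field.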
