Parsing nested JSON


(Ashay) #1

Hi

I have a JSON file as shown below that I want to read using Logstash. Each log file I receive contains exactly one JSON structure like this, and nothing is ever appended to the file afterwards. How can I combine these multiple lines into a single line so the JSON plugin in Logstash can parse it?

{
  "Stage1":
  {
    "Level1":
    {
      "field1": "True",
      "field2": "True",
      "field3": "True"
    }
  },
  "Stage2":
  {
    "decimal":
    {
      "field3": "0",
      "field4": "abc",
      "field5": "True"
    }
  }
}

Below is a copy of my Logstash config file:

input
{
  file
  {
    codec => multiline
    {
      pattern => '^}'
      negate => true
      what => next
    }
    path => ["/mnt/nfs/qa/JSONS/*.json"]
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter
{
  mutate
  {
    gsub => [ 'message', '\n', '' ]
    gsub => [ 'message', '\s', '' ]
    replace => [ "message", "%{message}" ]
  }
}

output
{
  stdout {}
}


#2

That multiline codec will turn the file into two events: one containing '{"Stage1":{"Level1":{"field1": "True","field2": "True","field3": "True"}},"Stage2":{"decimal":{"field3": "0","field4": "abc","field5": "True"}}' and the other containing just the final '}'. Neither one is valid JSON as-is. It would be possible to gsub the final } back onto the first event and drop events that contain only a }. However, that still leaves Logstash tailing the files after it has read them, so you have to know when to kill it.
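A sketch of that in-Logstash approach might look like the filter below. This is untested and only illustrative; it assumes the whitespace stripping from the original config has already reduced the brace-only event to a bare '}':

```
filter {
  mutate {
    # strip all whitespace, as in the original config
    gsub => [ 'message', '\s', '' ]
  }
  if [message] == "}" {
    # the trailing brace arrives as an event of its own; discard it
    drop {}
  } else {
    # re-attach the closing brace so the first event is valid JSON again
    mutate {
      replace => [ "message", "%{message}}" ]
    }
    json {
      source => "message"
    }
  }
}
```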

An alternative is to transform the files outside of logstash and use 'input { stdin { codec => json_lines } }'. Something like this...

( for F in /mnt/nfs/qa/JSONS/*.json ; do
    ( tr '\n' ' ' < "$F" ; echo )
done ) | /usr/share/logstash/bin/logstash -f ...
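As a quick sanity check of that transform, the snippet below (hypothetical sample data; it assumes python3 is available for validation) collapses a pretty-printed file to one line the same way the loop does, then confirms the result still parses as JSON:

```shell
# Create a small pretty-printed JSON file (hypothetical sample).
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
{
"Stage1":
{
"field1": "True"
}
}
EOF

# Collapse it to a single line, as the loop above does for each file.
line=$(tr '\n' ' ' < "$tmp"; echo)
echo "$line"

# Confirm the collapsed line is valid JSON (assumes python3 on PATH).
echo "$line" | python3 -c 'import json, sys; json.load(sys.stdin); print("valid")'
rm -f "$tmp"
```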

(system) #3

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.