Parsing JSON data which has a list of arrays inside an array

Hi All,

I am new to Logstash, and I have a requirement to parse a huge JSON file. Can someone help me out here?
Is it possible to parse the content below using a grok script?
Here is the sample file content:
{
  "environments": [
    {
      "dimensions": [
        {
          "metrics": [
            {
              "name": "sum(target_response_time)",
              "values": [ { "timestamp": 1520463600000, "value": "146.0" } ]
            },
            {
              "name": " sum(total_response_time)",
              "values": [ { "timestamp": 1520463600000, "value": "170.0" } ]
            },
            {
              "name": "sum(response_size)",
              "values": [ { "timestamp": 1520463600000, "value": "108.0" } ]
            }
          ],
          "name": "66504ed5-0378-d1b5-c95e-abd6a2ef6d45"
        },
        {
          "metrics": [
            {
              "name": "sum(target_response_time)",
              "values": [ { "timestamp": 1520463600000, "value": "221.0" } ]
            },
            {
              "name": " sum(total_response_time)",
              "values": [ { "timestamp": 1520463600000, "value": "222.0" } ]
            },
            {
              "name": "sum(response_size)",
              "values": [ { "timestamp": 1520463600000, "value": "168.0" } ]
            }
          ],
          "name": "3d651b1e-fce3-0b31-c128-f4b245fddda5"
        }
      ],
      "name": "dev"
    }
  ],
  "metaData": {
    "errors": [],
    "notices": [
      "source pg:33bd822f-967b-4454-9572-29257cb112f8"
    ]
  }
}

Any quick help is appreciated.
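For reference, here is how the nesting looks when walked in a short script. Python is used here only to illustrate the structure of the sample document, not as a Logstash replacement; the sample below is trimmed to one dimension and one metric for brevity:

```python
import json

# A trimmed version of the sample document above.
doc = json.loads("""
{
  "environments": [
    {
      "dimensions": [
        {
          "metrics": [
            {
              "name": "sum(target_response_time)",
              "values": [ { "timestamp": 1520463600000, "value": "146.0" } ]
            }
          ],
          "name": "66504ed5-0378-d1b5-c95e-abd6a2ef6d45"
        }
      ],
      "name": "dev"
    }
  ]
}
""")

# Each metric value is nested four arrays deep:
# environments[] -> dimensions[] -> metrics[] -> values[]
for env in doc["environments"]:
    for dim in env["dimensions"]:
        for metric in dim["metrics"]:
            for point in metric["values"]:
                print(dim["name"], metric["name"], point["timestamp"], point["value"])
```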

Hi @kumarkar,

Why do you need to write a grok pattern?
Logstash already has a json filter plugin that will parse the JSON data; see this link:

https://www.elastic.co/guide/en/logstash/current/plugins-filters-json.html

Otherwise, if you are reading from a file, try this small example; it may work:

input
{
  file
  {
    path => "C:/Test/file.json"   # your file path (use forward slashes, even on Windows)
    type => "json"
    sincedb_path => "NUL"         # "NUL" on Windows; use "/dev/null" on Linux
    start_position => "beginning"
    codec => "json"
  }
}

filter
{
  json
  {
    source => "message"
  }
}

output
{
  stdout { codec => rubydebug }
  elasticsearch
  {
    hosts => ["localhost:9200"]
    index => "test-%{+YYYY.MM.dd}"
  }
}

Thanks & Regards,
Krunal.

Hi @Krunal_kalaria,

I tried the approach you specified, outputting to a file, and I see that Logstash is trying to parse the file line by line and failing. Below are a few lines of the output file:
{"path":"D:\ELK\Test\test.json","tags":["_jsonparsefailure"],"message":"{\r","type":"json","@version":"1","host":"A2ML20110","@timestamp":"2018-03-15T12:40:13.888Z"}
{"path":"D:\ELK\Test\test.json","tags":["_jsonparsefailure"],"message":" "environments": [ {\r","type":"json","@version":"1","host":"A2ML20110","@timestamp":"2018-03-15T12:40:13.945Z"}
{"path":"D:\ELK\Test\test.json","tags":["_jsonparsefailure"],"message":" "dimensions": [\r","type":"json","@version":"1","host":"A2ML20110","@timestamp":"2018-03-15T12:40:13.956Z"}
{"path":"D:\ELK\Test\test.json","tags":["_jsonparsefailure"],"message":" {\r","type":"json","@version":"1","host":"A2ML20110","@timestamp":"2018-03-15T12:40:13.972Z"}
{"path":"D:\ELK\Test\test.json","tags":["_jsonparsefailure"],"message":" "name": " sum(total_response_time)",\r","type":"json","@version":"1","host":"A2ML20110","@timestamp":"2018-03-15T12:40:14.305Z"}
{"path":"D:\ELK\Test\test.json","tags":["_jsonparsefailure"],"message":" "values": [ {\r","type":"json","@version":"1","host":"A2ML20110","@timestamp":"2018-03-15T12:40:14.371Z"}

Use a multiline codec to join the lines of the file to a single event (you should be able to find a working example in previous posts), then use a json filter to parse the resulting JSON string.
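A minimal sketch of that approach, assuming the whole file is a single JSON document whose closing } is the only line that begins with } (the paths and settings here are placeholders to adapt):

input
{
  file
  {
    path => "D:/ELK/Test/test.json"
    sincedb_path => "NUL"           # "NUL" on Windows; "/dev/null" on Linux
    start_position => "beginning"
    codec => multiline {
      # Join every line that does NOT start with "}" onto the next line,
      # so the whole document becomes a single event.
      pattern => "^}"
      negate => true
      what => "next"
      # Flush the final event even though no further line arrives after the closing "}".
      auto_flush_interval => 1
    }
  }
}

filter
{
  json
  {
    source => "message"
  }
}

output
{
  stdout { codec => rubydebug }
}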

Hi @magnusbaeck,

I tried the configuration below, but it is not parsing the file. If possible, can you please provide me some sample code?

input
{
  file
  {
    path => "D:/ELK/Test/test.json"   # forward slashes, even on Windows
    codec => multiline {
      pattern => "^}"
      negate => true
      what => "next"
      max_lines => 20000
    }
  }
}

filter
{
  mutate { gsub => ["message", "\n", ""] }
  json { source => "message" }
  split { field => "environments" }   # the sample document's top-level array is "environments"
}

output
{
  stdout { codec => rubydebug }
  file
  {
    path => "D:/ELK/Test/Out.json"
    create_if_deleted => true
    flush_interval => 0
  }
}

Regards,
Ravi.

Can someone help me here?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.