JSON search and filter in array


I am trying to figure out how to filter on a JSON field and display only the corresponding field, but it seems that because of my JSON format (everything on a single line) it is not working.
Any idea how can I do that?

I am trying to filter on series.fieldData.fieldId:189 to create one line and, for example, on series.fieldData.fieldId:198 to show another line.

I get the same result whether I use the first or the second filter, and I think it is because they are treated as the same JSON input.
Is working with a filter in Logstash the only way to format the JSON differently?

Many thanks

{"intervalData": {"endTime": "2016-07-08T17:01:00Z", "intervals": ["2016-07-08T17:00:00Z"], "startTime": "2016-07-08T17:00:00Z", "intervalDurationSeconds": 60}, "fieldGroups": [], "series": [{"legend": [{"fieldId": 16, "data": {"status": "VALID", "filterKey": "app 'HTTPS/TCP'", "string": "HTTPS (TCP)"}}], "fieldData": [{"fieldId": 53, "data": [{"status": "VALID", "unsigned": 2247173}]}, {"fieldId": 189, "data": [{"status": "VALID", "float": 175.536337}]}, {"fieldId": 198, "data": [{"status": "VALID", "float": 62.338743}]}]}, {"legend": [{"fieldId": 16, "data": {"status": "VALID", "filterKey": "app 'VPX PRIMEXIS/TCP'", "string": "VPX PRIMEXIS (TCP)"}}], "fieldData": [{"fieldId": 53, "data": [{"status": "VALID", "unsigned": 2218588}]}, {"fieldId": 189, "data": [{"status": "VALID", "float": 1301.487157}]}, {"fieldId": 198, "data": [{"status": "INVALID", "float": 0.0}]}]}, {"legend": [{"fieldId": 16, "data": {"status": "VALID", "filterKey": "app 'Other/TCP'", "string": "Other (TCP)"}}], "fieldData": [{"fieldId": 53, "data": [{"status": "VALID", "unsigned": 1181386}]}, {"fieldId": 189, "data": [{"status": "INVALID", "float": 0.0}]}, {"fieldId": 198, "data": [{"status": "INVALID", "float": 0.0}]}]}, {"legend": [{"fieldId": 16, "data": {"status": "VALID", "filterKey": "app 'Citrix CGP/TCP'", "string": "Citrix CGP (TCP)"}}], "fieldData": [{"fieldId": 53, "data": [{"status": "VALID", "unsigned": 396249}]}, {"fieldId": 189, "data": [{"status": "VALID", "float": 179.666869}]}, {"fieldId": 198, "data": [{"status": "INVALID", "float": 0.0}]}]}, {"legend": [{"fieldId": 16, "data": {"status": "VALID", "filterKey": "app 'SMTP/TCP'", "string": "SMTP (TCP)"}}], "fieldData": [{"fieldId": 53, "data": [{"status": "VALID", "unsigned": 332291}]}, {"fieldId": 189, "data": [{"status": "VALID", "float": 33.730212}]}, {"fieldId": 198, "data": [{"status": "VALID", "float": 11.715309}]}]}, {"legend": [{"fieldId": 16, "data": {"status": "VALID", "filterKey": "app 'Citrix ICA/TCP'", "string": 
"Citrix ICA (TCP)"}}], "fieldData": [{"fieldId": 53, "data": [{"status": "VALID", "unsigned": 188940}]}, {"fieldId": 189, "data": [{"status": "VALID", "float": 47.903104}]}, {"fieldId": 198, "data": [{"status": "INVALID", "float": 0.0}]}]}, {"legend": [{"fieldId": 16, "data": {"status": "VALID", "filterKey": "app 'HTTP/TCP'", "string": "HTTP (TCP)"}}], "fieldData": [{"fieldId": 53, "data": [{"status": "VALID", "unsigned": 162387}]}, {"fieldId": 189, "data": [{"status": "VALID", "float": 615.022797}]}, {"fieldId": 198, "data": [{"status": "VALID", "float": 30.162623}]}]}, {"legend": [{"fieldId": 16, "data": {"status": "VALID", "filterKey": "app 'SSH/TCP'", "string": "SSH (TCP)"}}], "fieldData": [{"fieldId": 53, "data": [{"status": "VALID", "unsigned": 101810}]}, {"fieldId": 189, "data": [{"status": "VALID", "float": 711.933809}]}, {"fieldId": 198, "data": [{"status": "VALID", "float": 389.164565}]}]}, {"legend": [{"fieldId": 16, "data": {"status": "VALID", "filterKey": "app 'DNS/UDP'", "string": "DNS (UDP)"}}], "fieldData": [{"fieldId": 53, "data": [{"status": "VALID", "unsigned": 30108}]}, {"fieldId": 189, "data": [{"status": "VALID", "float": 8.144228}]}, {"fieldId": 198, "data": [{"status": "INVALID", "float": 0.0}]}]}]}
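To make the nesting in that payload concrete (this is just an illustrative sketch, not part of the Logstash pipeline): every element of `series` carries a `fieldData` array, and each fieldId's value sits one array deeper, which is why a flat field filter cannot tell 189 and 198 apart. A minimal Python walk over a trimmed copy of the sample:

```python
import json

# A trimmed version of the payload above, showing the nesting:
# series[] -> fieldData[] -> data[] -> the actual value.
payload = json.loads("""
{"series": [
  {"legend": [{"fieldId": 16, "data": {"status": "VALID", "string": "HTTPS (TCP)"}}],
   "fieldData": [
     {"fieldId": 189, "data": [{"status": "VALID", "float": 175.536337}]},
     {"fieldId": 198, "data": [{"status": "VALID", "float": 62.338743}]}]}
]}
""")

def field_value(series_entry, wanted_id):
    """Return the first data value for the given fieldId, or None."""
    for fd in series_entry["fieldData"]:
        if fd["fieldId"] == wanted_id:
            d = fd["data"][0]
            return d.get("float", d.get("unsigned"))
    return None

for s in payload["series"]:
    label = s["legend"][0]["data"]["string"]
    print(label, field_value(s, 189), field_value(s, 198))
```

The helper `field_value` is hypothetical; the point is only that reaching a single fieldId means walking two arrays, which flat search filters do not do.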

Answering my own question...
I found it is actually very simple: I only needed to split the series using the Logstash split filter:

filter {
  split {
    field => "series"
  }
}
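For readers unfamiliar with what split does: it clones the event once per element of the named array, so each output event carries a single `series` object plus the shared top-level fields. A rough Python sketch of that behavior (names here are illustrative, not Logstash internals):

```python
import copy

def split_events(event, field):
    """Emit one copy of the event per element of event[field],
    mimicking what Logstash's split filter does to the 'series' array."""
    out = []
    for item in event[field]:
        clone = copy.deepcopy(event)
        clone[field] = item  # the array is replaced by one element
        out.append(clone)
    return out

doc = {"intervalData": {"startTime": "2016-07-08T17:00:00Z"},
       "series": [{"fieldData": [{"fieldId": 189}]},
                  {"fieldData": [{"fieldId": 198}]}]}

events = split_events(doc, "series")
# Two events, each with one series object and the shared intervalData.
```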

Glad you figured it out. Formatting the data in Logstash before the data gets indexed is the best way to solve this.

Yep thanks :slight_smile:
Now I have another issue, perhaps not too complicated, but I am unable to understand why...

After splitting in Logstash I get individual JSON objects like this.

It is pretty cool, as I am able to filter on series.legend.filterKey: each individual JSON line has its own key, so I can differentiate my charts based on this criterion.

Now I am trying to build a chart and filter on two fields (series.fieldData.fieldId).
Value 189 represents a response time and value 198 another delay. I would like to show them on the same chart, but when I filter on both, the chart is identical both times... It is as if it can only read the first value and never reaches the 198 data.
The same thing happens if I build two separate charts, each filtered individually on series.fieldData.fieldId:198 / series.fieldData.fieldId:189.

I see that I still have an array, but with only one field, so I guess I don't need to split it again in Logstash?
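One way around the remaining nesting (this is a sketch of an approach, not the thread's solution; in Logstash it could be done with a ruby filter or further processing) is to flatten the `fieldData` array into top-level keys like `field_189`, so no objects-in-arrays reach Elasticsearch at all. The `field_<id>` naming is an assumption chosen for illustration:

```python
def flatten_field_data(series_entry):
    """Replace the fieldData array with flat keys like 'field_189',
    keeping only VALID values, so nothing nested in an array remains."""
    flat = {}
    for fd in series_entry.pop("fieldData", []):
        d = fd["data"][0]
        if d.get("status") == "VALID":
            flat["field_%d" % fd["fieldId"]] = d.get("float", d.get("unsigned"))
    series_entry.update(flat)
    return series_entry

entry = {"fieldData": [
    {"fieldId": 189, "data": [{"status": "VALID", "float": 47.903104}]},
    {"fieldId": 198, "data": [{"status": "INVALID", "float": 0.0}]}]}
flatten_field_data(entry)
# entry is now {"field_189": 47.903104}; the INVALID 198 entry was dropped.
```

With flat fields like these, each chart can aggregate on its own field directly instead of filtering inside an array.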

Update: I also noticed that I get a warning message: "Objects in arrays are not well supported." I think the problem is coming from here.

Many thanks for your help

Sounds like you want to be able to create a visualization based on data with nested objects. Unfortunately, this type of aggregation is not available in Kibana for visualization. The issue is being tracked here: https://github.com/elastic/kibana/issues/1084

Sounds like you are right... many thanks!