Merge separate JSON objects into an array

I am querying an Elasticsearch instance.

I end up with a series of JSON objects.

{"business_id": "1",
 "Accepts Credit Cards": true,
 "Price Range": 1,
 "type": "food"}
{"business_id": "2",
 "Accepts Credit Cards": true,
 "Price Range": 2,
 "type": "cloth"}
{"business_id": "3",
 "Accepts Credit Cards": false,
 "Price Range": 3,
 "type": "sports"}

When using the json codec, the output remains a series of separate JSON objects.
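
(The input is a file read with the json codec, along the lines of the sketch below; the path is just a placeholder.)

input {
  file {
    path => "/path/to/businesses.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => "json"
  }
}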

I want to end up with an array of JSON objects instead.

{"some_name": [ {"business_id": "1",
"Accepts Credit Cards": true,
"Price Range": 1,
"type": "food"}
{"business_id": "2",
"Accepts Credit Cards": true,
"Price Range": 2,
"type": "cloth"}
{"business_id": "3",
"Accepts Credit Cards": false,
"Price Range": 3,
"type": "sports"}]}

Are these separate documents, or multiple objects in the same document?

Multiple objects in the same document.

I'm guessing that one would use filter -> aggregate for this?

It might be easier to do it using ruby. Is there any way you can show us an example document? Either from stdout { codec => rubydebug } or from the JSON shown in Discover in Kibana.
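
That is, temporarily add an output such as:

output {
  stdout { codec => rubydebug }
}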

The file is as posted above.
Separate JSON objects:
{"id":"123", "array1":["val1", "val2"], "array2":["val3", "val4"]}
{"id":"234", "array1":["val11", "val22"], "array2":["val3", "val4"]}
{"id":"345", "array1":["val71", "val22"], "array2":["val3", "val4"]}
and instead I want to have an array of objects
{"name1": [{"id":"123", "array1":["val1", "val2"], "array2":["val3", "val4"]},
{"id":"234", "array1":["val11", "val22"], "array2":["val3", "val4"]},
{"id":"345", "array1":["val71", "val22"], "array2":["val3", "val4"]}]}

What I have tried

  1. Include a new name-value pair with the same value in every object:
    {"index":"sw", "id":"123", "array1":["val1", "val2"], "array2":["val3", "val4"]}
    {"index":"sw", "id":"234", "array1":["val11", "val22"], "array2":["val3", "val4"]}
    {"index":"sw", "id":"345", "array1":["val71", "val22"], "array2":["val3", "val4"]}
    and then try to aggregate on it:
    filter {
      ....
      mutate {
        aggregate => {
          task_id => %{index}
          map['id'] => event.get('id')
          map['array1'] => event.get('array1')
          map['array2'] => event.get('array2')
        }
      }
      ....
    }

but I am getting an error.

I am unable to determine what error you are getting.

If I understand that right, you put an aggregate filter inside of a mutate filter and did not wrap the code for the aggregation in code => "". It looks pretty chaotic, to be honest...
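
For what it's worth, a correctly structured version of that idea would look roughly like the sketch below. It is untested; the 'events' map key and the 10 second timeout are just examples, and note that the aggregate filter only works reliably with a single pipeline worker (-w 1).

filter {
  aggregate {
    task_id => "%{index}"
    code => "
      # collect the fields of each event into an array held in the shared map
      map['events'] ||= []
      map['events'] << {
        'id'     => event.get('id'),
        'array1' => event.get('array1'),
        'array2' => event.get('array2')
      }
      # drop the original event so only the aggregated one gets emitted
      event.cancel
    "
    push_map_as_event_on_timeout => true
    timeout => 10
    timeout_code => "
      # the pushed event carries the map entries as fields; wrap the array
      # under the desired name and remove the temporary field
      event.set('name1', event.get('events'))
      event.remove('events')
    "
  }
}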

Yes, thanks Badger and Jenni for replying.

On review, I think that I am misusing/abusing Logstash. Logstash is for streams of data, which is why I'm getting a series of JSON objects.

I shouldn't be trying to do this, since I'm trying to take ALL of my Logstash events and put them into a single JSON string, which goes against Logstash's stream-based handling of data.

Thanks for the replies.
Mods - you can close this now if you wish.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.