Can Logstash handle an array of nested objects?

I am unable to configure Logstash to load an array of nested objects as described in the Elasticsearch guide.

I can map and load the array of nested objects via the Elasticsearch REST API (as shown below), but I am unable to do the same via Logstash when there is an unknown number of nested objects in the array (in this case, an unknown number of comments).

Any suggestions?

PUT /my_index/blogpost/1
{
  "title": "Nest eggs",
  "body":  "Making your money work...",
  "tags":  [ "cash", "shares" ],
  "comments": [ 
    {
      "name":    "John Smith",
      "comment": "Great article",
      "age":     28,
      "stars":   4,
      "date":    "2014-09-01"
    },
    {
      "name":    "Alice White",
      "comment": "More like this please",
      "age":     31,
      "stars":   5,
      "date":    "2014-10-22"
    }
  ]
}
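
For reference, the mapping I use follows the guide's nested-objects example, with comments declared as a nested type. Roughly like this (the exact field types are just my reading of the guide, and may differ in my index):

PUT /my_index
{
  "mappings": {
    "blogpost": {
      "properties": {
        "comments": {
          "type": "nested",
          "properties": {
            "name":    { "type": "string" },
            "comment": { "type": "string" },
            "age":     { "type": "short" },
            "stars":   { "type": "short" },
            "date":    { "type": "date" }
          }
        }
      }
    }
  }
}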

What problems are you having, specifically?

If I try to load the comments field as a stringified JSON object, I get this error:

"object mapping for [comments] tried to parse field [comments] as object, but found a concrete value"

I cannot use mutate to rename fields into the nested object's properties because the number of comments is unknown.

But why would you want to store that field as serialized JSON? The error message above means that the comments field has been mapped as an object but you're trying to index a document with that field being a non-object, in your case probably a string. If you really want to store a string (it sounds ill-advised to me) you need to reindex and make sure comments is mapped as a string.

It would probably be helpful if you showed us an example Logstash event. Then we can see if we can turn that into the kind of document you indexed manually with a REST call.
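
That said, if the comments field really does arrive in your events as a JSON string and you just want it turned back into an array of objects before it reaches Elasticsearch, a json filter may be all you need. A rough sketch, assuming the string sits in a field named comments:

filter {
  # Assumes "comments" holds a JSON string such as
  # '[{"name": "John Smith", "comment": "Great article", ...}, ...]'.
  # The json filter parses that string in place into an array of objects,
  # so the elasticsearch output indexes real nested objects rather than a string.
  json {
    source => "comments"
    target => "comments"
  }
}

With comments holding actual objects, the mapping error above should go away.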

I do not want to store the field as serialized JSON; I want to store it as a nested object. Posting the JSON document via the REST API stores the nested objects perfectly, which is why I attempted to pass the comments in as a string via Logstash.

The event looks like the example above. Originally each comment is a separate row in a database, and there may be any number of comments per document.

It's still not clear to me what the problem is. You're pulling multiple rows from a database but can't join them into a single event that you can pass to ES? Or you already have the comments in a single event (like in the example)? We'll save everyone's time if you can be explicit about your configuration and what your events currently look like (we know what you want them to look like).

The events currently look like the example above: a "blog" table has a row with the title, body, and tags, and a "comments" table contains name, comment, age, stars, and date, with potentially multiple comment rows per blog row. I can flatten the multiple rows into a single event row using SQL. The question is how to configure Logstash to load this into ES, if it is possible, when the number of nested comment objects is unknown.

I can do this using the ES Java client. I can do this with a custom REST client. I would prefer to use Logstash if possible.
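
One direction I'm looking at, for anyone following along: if I keep one row per comment in the query, the aggregate filter appears to be able to merge consecutive rows for the same blog post into a single event with a comments array. A rough, untested sketch; the table and column names (blog_id and friends) are just stand-ins for my schema, and connection details are omitted:

input {
  jdbc {
    # One row per comment, joined to its blog post, ordered so that all rows
    # for a given post arrive consecutively. (Tags omitted for brevity.)
    statement => "SELECT b.id AS blog_id, b.title, b.body,
                         c.name, c.comment, c.age, c.stars, c.date
                  FROM blog b JOIN comments c ON c.blog_id = b.id
                  ORDER BY b.id"
    # jdbc_connection_string, jdbc_driver_class, credentials, etc. go here
  }
}

filter {
  # The aggregate filter needs a single pipeline worker (-w 1) so rows stay in order.
  aggregate {
    task_id => "%{blog_id}"
    code => "
      map['blog_id'] ||= event.get('blog_id')
      map['title']   ||= event.get('title')
      map['body']    ||= event.get('body')
      map['comments'] ||= []
      map['comments'] << {
        'name'    => event.get('name'),
        'comment' => event.get('comment'),
        'age'     => event.get('age'),
        'stars'   => event.get('stars'),
        'date'    => event.get('date')
      }
      event.cancel()
    "
    # When a new blog_id shows up, push the accumulated map as a single event;
    # the per-row events themselves are cancelled above.
    push_previous_map_as_event => true
    timeout => 30
  }
}

output {
  elasticsearch {
    hosts       => ["localhost:9200"]
    index       => "my_index"
    document_id => "%{blog_id}"
  }
}

If that works, each blog post comes out as one document with a comments array, which the nested mapping above should accept.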