Logstash jdbc plugin - nested data

I am in the process of migrating from the old JDBC river plugin to Logstash 1.5 with the logstash-input-jdbc plugin.

I am, however, running into some difficulty. Previously, if I had 50 rows but only 20 unique IDs (say, 20 customers, several of whom have more than one mailing address), the river would fold the SQL results into nested array fields in my document during bulk insertion:

{
   "_index": "matchstick",
   "_id": "0000117947",
   "_score": 1,
   "_source": {
      "id_number": "0000117947",
      "pref_mail_name": "Joanne Reaume",
      "street1": [
         "364 Lexington Cir",
         "1751 Hawthorne Rd"
      ],
      "street2": " ",
      "street3": " ",
      "city": [
         "Romeo",
         "Grosse Pointe Woods"
      ],
      "state_code": "MI",
      "zipcode": [
         "48065-3006",
         "48236-1447"
      ]
   }
}

When I run the logstash-input-jdbc plugin, it seems to just replace the document each time a row with the same ID is encountered, creating a new version rather than any nested fields:

{
   "_index": "matchstick",
   "_type": "logs",
   "_id": "0000117947",
   "_version": 2,
   "found": true,
   "_source": {
      "id_number": "0000117947",
      "pref_mail_name": "Joanne Reaume",
      "street1": "1751 Hawthorne Rd",
      "street2": " ",
      "street3": " ",
      "city": "Grosse Pointe Woods",
      "state_code": "MI",
      "zipcode": "48236-1447",
      "first_name": "Joanne",
      "middle_name": " ",
      "last_name": "Reaume",
      "email_address": null,
      "@version": "1",
      "@timestamp": "2015-11-05T18:13:56.780Z"
   }
}
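
For context, the pipeline is basically a jdbc input feeding an elasticsearch output keyed on the customer ID. The sketch below is a stripped-down stand-in (the connection string, driver, and statement are placeholders, not my real config), but it shows why the overwrite happens: every address row is indexed as a complete document under the same document_id, so the last row wins.

input {
  jdbc {
    # placeholder connection details
    jdbc_connection_string => "jdbc:mysql://localhost:3306/crm"
    jdbc_user => "user"
    jdbc_password => "secret"
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # one row per customer/address combination
    statement => "SELECT id_number, pref_mail_name, street1, street2, street3, city, state_code, zipcode FROM customers JOIN addresses USING (id_number)"
  }
}

output {
  elasticsearch {
    index => "matchstick"
    # each row re-indexes the whole document under the same id,
    # so a customer's second address simply replaces the first
    document_id => "%{id_number}"
  }
}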

Is there any way to fold query results into nested fields?
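
The closest thing I have come across is the logstash-filter-aggregate plugin, which can merge consecutive rows that share a task_id into a single event before they reach the output. I have not been able to test it (it needs a newer Logstash than the 1.5 I am on, and the event.get API in the sketch is newer still), so treat this as an untested sketch; the "addresses" field name is my own choice, not anything the plugin requires:

filter {
  aggregate {
    # group rows by customer id; rows for one customer must arrive consecutively
    task_id => "%{id_number}"
    code => "
      map['id_number']      ||= event.get('id_number')
      map['pref_mail_name'] ||= event.get('pref_mail_name')
      map['addresses']      ||= []
      map['addresses'] << {
        'street1'    => event.get('street1'),
        'city'       => event.get('city'),
        'state_code' => event.get('state_code'),
        'zipcode'    => event.get('zipcode')
      }
      event.cancel()
    "
    # when a row with a new id arrives, emit the previous map as one merged event
    push_previous_map_as_event => true
    timeout => 5
  }
}

For this to work the SELECT needs an ORDER BY id_number so a customer's rows come back to back, and Logstash has to run with a single worker (-w 1); otherwise rows for the same ID can be processed by different workers and never get merged. The elasticsearch output can then keep document_id => "%{id_number}" as before.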

@jrizzi1

Hi, have you solved this issue? I am facing a similar situation.
Kindly reply if you have a solution. Thanks!

Any luck? I have the same issue.

No luck! I solved this by avoiding Logstash and building a custom ETL tool instead.

I did the same and built a custom ETL tool. However, Elasticsearch always aggregates at the document level, never at the nested object level, so the value of nested objects is a bit more limited than I had thought.

@magnusbaeck That is something that should have been part of the JDBC plugin as well, right? It would be pretty handy, since we would not have to go to third-party apps to handle uploading data to Elasticsearch in a particular structure.

@Architha can you please share the ETL here?

@jrizzi1 can you share the jdbc river config?

My river config runs into the same issue I'm describing; it hasn't been fixed, as there's been no movement towards fixing it.