Configure Logstash to create an Elasticsearch document with nested arrays

I'm indexing my PostgreSQL data into Elasticsearch using the Logstash JDBC Input Plugin. I have two tables, REQUEST and ASSIGNMENT, listed below.

How can I use Logstash to index the two tables into one Elasticsearch document of type REQUEST with a nested array of all child ASSIGNMENT records? Is Grok something that could help me out?

Table: REQUEST

REQUEST_ID | POC
---------- | ----------------
1234       | Jon Snow
1256       | Tyrion Lannister


Table: ASSIGNMENT

ASSIGN_ID | REQUEST_ID | STATUS  | CREATED
--------- | ---------- | ------- | ----------
2345      | 1234       | New     | 2017-01-06
2364      | 1234       | Working | 2017-03-12
2399      | 1234       | Working | 2017-05-20
5736      | 1256       | New     | 2017-06-28

This is what I want my Elasticsearch document to look like, shown as a sample of the _source value from a search result:

"_source": {
  "request_id": "1234",
  "poc": "Jon Snow",
  "assignments": [
    {
      "assign_id": "2345",
      "status": "New",
      "created": "2017-01-06"
    },
    {
      "assign_id": "2364",
      "status": "Working",
      "created": "2017-03-12"
    },
    {
      "assign_id": "2399",
      "status": "Working",
      "created": "2017-05-20"
    }
  ]
}
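
For what it's worth, the direction I've been leaning toward (not yet verified) is to build the nesting in SQL itself with json_agg / json_build_object, then parse that column back into a real array with a json filter before sending it to Elasticsearch. The sketch below is roughly what I have in mind; the driver path, connection string, credentials, and index name are placeholders of my own, while the table and column names match the ones above.

input {
  jdbc {
    # Placeholder connection settings -- adjust driver path, URL, and credentials
    jdbc_driver_library => "/path/to/postgresql-42.2.5.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "postgres"
    jdbc_password => "secret"
    # One row per REQUEST, with all child ASSIGNMENT rows
    # aggregated into a JSON array column called "assignments"
    statement => "
      SELECT r.request_id,
             r.poc,
             json_agg(json_build_object(
               'assign_id', a.assign_id,
               'status',    a.status,
               'created',   a.created
             )) AS assignments
      FROM request r
      LEFT JOIN assignment a ON a.request_id = r.request_id
      GROUP BY r.request_id, r.poc
    "
  }
}

filter {
  # json_agg arrives as a plain string column, so parse it back into a real array
  json {
    source => "assignments"
    target => "assignments"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "requests"
    document_id => "%{request_id}"
  }
}

I assume that if the assignments need to be queried as independent objects, the assignments field would also have to be declared as a nested type in the index mapping rather than relying on dynamic mapping. Is this a reasonable approach, or is there a more idiomatic Logstash way to do it?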
