Create nested object mapping with Logstash when unloading from Postgres

Hi, I am unloading data from Postgres to Elasticsearch using Logstash. If the index does not exist in Elasticsearch, Logstash creates it with a predefined mapping. That mapping is correct for all of the data.

But one specific array of objects in the document needs to be of the nested datatype. Can I do that with some Logstash filter?

For example

"members": {
          "properties": {
            "applicant": {
              "type": "text",
              "fields": {
                "keyword": {
                  "type": "keyword",
                  "ignore_above": 256
                }
              }
            },
            "application_dt": {
              "type": "date"
            }
         }
 }

This is the current mapping of one array key, members. I want it to automatically become the nested datatype on index creation.
The data looks like this:

{
....,
members: [
    {applicant: 'Name', application_dt: '2009-01-01'},
    ....
]
....
}
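What I want is essentially the same mapping with the nested type added, something like:

```json
"members": {
  "type": "nested",
  "properties": {
    "applicant": {
      "type": "text",
      "fields": {
        "keyword": {
          "type": "keyword",
          "ignore_above": 256
        }
      }
    },
    "application_dt": {
      "type": "date"
    }
  }
}
```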

Did you create a template, or are you letting Elasticsearch create the mapping?

If you created a template, you will need to change the mapping in that template. If you are letting Elasticsearch create the mapping, you will need to create an index template that applies the correct mapping.
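As a sketch, an index template that makes the `members` field nested could look like this (the template name and index pattern below are just examples, and this assumes a recent Elasticsearch version with the composable `_index_template` API):

```json
PUT _index_template/my-template
{
  "index_patterns": ["my-index*"],
  "template": {
    "mappings": {
      "properties": {
        "members": {
          "type": "nested",
          "properties": {
            "applicant": {
              "type": "text",
              "fields": {
                "keyword": { "type": "keyword", "ignore_above": 256 }
              }
            },
            "application_dt": { "type": "date" }
          }
        }
      }
    }
  }
}
```

Any field not listed in the template will still be mapped dynamically by Elasticsearch, so you only need to declare the fields whose type you want to control.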

So either way I have to create a template for it, I guess. It cannot be done without a template, with a filter, or by converting the value to some specific datatype? I have to maintain a template before creating records in Elasticsearch, correct?

Yes, if you want that field to be of the nested type, you will need to create a template for it before creating the index.

You can then configure logstash to apply this template when sending data.
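A minimal sketch of that output configuration, using the template options of the elasticsearch output plugin (the hosts, index name, and file path below are placeholders):

```
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "my-index"
    # Point Logstash at your template file and let it install the template
    manage_template => true
    template => "/path/to/my-template.json"
    template_name => "my-template"
    template_overwrite => true
  }
}
```

With `manage_template => true`, Logstash uploads the template on startup, so the index created on first insert already has `members` mapped as nested.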

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.