Pipeline processor to convert nested array to documents while reindexing

Hi Team,

I need a solution that reads data from a source index and, using an ingest pipeline, extracts each element of an array in every source document as an individual document in a destination index.

e.g.

main index data:

    {
        "docId" : 1,
        "lineItems" : [
            {
                "lineId" : "line_1",
                "name" : "line one"
            },
            {
                "lineId" : "line_2",
                "name" : "line two"
            }
        ]
    }

Required Data in Destination Index:

    [
        {
            "docId" : 1,
            "lineId" : "line_1",
            "name" : "line one"
        },
        {
            "docId" : 1,
            "lineId" : "line_2",
            "name" : "line two"
        }
    ]

Please suggest a way to implement this.

Thanks!!

@dadoonet Please help!

Please be patient when waiting for responses to your question, and refrain from pinging multiple times or opening multiple topics for the same question. This is a community forum; it may take time for someone to reply. For more information, please refer to the Community Code of Conduct, specifically the section "Be patient". Also, please refrain from pinging folks directly; this is a forum, and anyone who participates might be able to assist you.

If you are in need of a service with an SLA that covers response times for questions then you may want to consider talking to us about a subscription.

It's fine to answer on your own thread after 2 or 3 days (not including weekends) if you don't have an answer.

I don't think you can generate two documents from a single one with an ingest pipeline.

Logstash has a split filter that can do this, but as far as I know it is not possible in ingest pipelines.
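For reference, here is a rough sketch of what such a Logstash pipeline could look like. The host URL and the index names (`main_index`, `line_items`) are placeholders, not from the original question; adjust them to your environment:

    # Sketch only: reads from a source index, emits one event per
    # lineItems element, and writes each event to a destination index.
    input {
      elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "main_index"
      }
    }

    filter {
      # split clones the event once per element of the lineItems array
      split {
        field => "lineItems"
      }
      # Promote the nested fields to the top level, then drop the wrapper
      mutate {
        rename => {
          "[lineItems][lineId]" => "lineId"
          "[lineItems][name]"   => "name"
        }
        remove_field => ["lineItems"]
      }
    }

    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "line_items"
      }
    }

With the sample document above, this would produce two destination documents, each carrying `docId` plus one line item's `lineId` and `name`.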

Thanks @dadoonet and sorry for that, will make sure from now on.
Yes @Christian_Dahlqvist, I'm using the same split filter to perform this activity. Thank you!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.