How can I parse a nested JSON object with arrays and dicts into Elasticsearch using Logstash?

I have the following nested JSON object:

{
  "data": {
    "testChannels": [
      {
        "name": "TC1",
        "Time": "2021-08-31 12:02:51.103188",
        "info": {
          "user": "johan",
          "epic": "Epic"
        },
        "template": {
          "name": "E"
        },
        "nodes": [
          {
            "instance": {
              "name": "u00871"
            }
          },
          {
            "instance": {
              "name": "u00388"
            }
          },
          {
            "instance": {
              "name": "qc-5946"
            }
          },
          {
            "instance": {
              "name": "us00009"
            }
          }
        ]
      },
      {
        "name": "TC4",
        "Time": "2021-09-21 13:23:02.985500",
        "info": {
          "user": "eak",
          "epic": "Ex"
        },
        "template": {
          "name": "AVI"
        },
        "nodes": [
          {
            "instance": {
              "name": "e0294"
            }
          },
          {
            "instance": {
              "name": "us00646"
            }
          },
          {
            "instance": {
              "name": "sim00183"
            }
          }
        ]
      }
    ]
  }
}

I want to store the data as two documents, TC1 and TC4, with all of their fields stored as key/value pairs.
Please help. Thanks.

You can use a split filter to emit one event per entry of the [data][testChannels] array (two events for your sample), then use a mutate filter to move fields around if you do not like the resulting layout.
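A minimal pipeline sketch of that approach, assuming the whole JSON object arrives as a single event (e.g. via the json codec on stdin) and that the Elasticsearch host and index name below are placeholders you would replace:

```
input {
  stdin { codec => json }   # assumes the entire object is read as one event
}

filter {
  # Emit one event per element of the [data][testChannels] array.
  split { field => "[data][testChannels]" }

  # Promote the channel fields to the top level, then drop the wrapper.
  mutate {
    rename => {
      "[data][testChannels][name]"     => "name"
      "[data][testChannels][Time]"     => "Time"
      "[data][testChannels][info]"     => "info"
      "[data][testChannels][template]" => "template"
      "[data][testChannels][nodes]"    => "nodes"
    }
    remove_field => ["data"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]  # assumed local cluster
    index => "testchannels"             # hypothetical index name
  }
}
```

With your sample input this should produce one document with name TC1 and one with name TC4, each keeping its own info, template, and nodes substructures.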