Spark Elasticsearch: Append new elements to nested array of objects

I have an Elasticsearch query that appends new elements to a nested array of objects:

POST /transactions/_doc/1/_update
{
    "script": {
        "source": "ctx._source.transactions.addAll(params.transactions)",
        "params": {
            "transactions": [
                { "date": "2020-07-14T21:10:22Z", "amount": 890 },
                { "date": "2020-07-15T15:56:18Z", "amount": 54 }
            ]
        }
    }
}

How can I achieve this from a Spark DataFrame? The DataFrame and its schema look like this:

+---+--------------------+
| id|        transactions|
+---+--------------------+
|  1|[[890, 2021-07-14...|
+---+--------------------+
 |-- id: integer (nullable = false)
 |-- transactions: array (nullable = true)
 |    |-- element: struct (containsNull = true)
 |    |    |-- amount: long (nullable = true)
 |    |    |-- date: string (nullable = true)
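
From the elasticsearch-hadoop docs, I think the connector's scripted-upsert options (es.write.operation=upsert together with the es.update.script.* settings) should cover this, but I'm not certain about the details. Below is a rough sketch of what I have in mind; the es.nodes/es.port values and the transactions/_doc resource are placeholders, and I'm not sure an array-of-struct column can be passed through es.update.script.params like this.

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("append-transactions")
  .config("es.nodes", "localhost")   // placeholder cluster address
  .config("es.port", "9200")
  .getOrCreate()
import spark.implicits._

// Rebuild the DataFrame shown above: id plus an array<struct<amount,date>> column.
case class Txn(amount: Long, date: String)
val df = Seq(
  (1, Seq(Txn(890L, "2020-07-14T21:10:22Z"), Txn(54L, "2020-07-15T15:56:18Z")))
).toDF("id", "transactions")

df.write
  .format("org.elasticsearch.spark.sql")
  .option("es.mapping.id", "id")                  // route each row to the document with this _id
  .option("es.write.operation", "upsert")         // update existing docs, create missing ones
  .option("es.update.script.lang", "painless")
  .option("es.update.script.inline",
    "ctx._source.transactions.addAll(params.transactions)")
  .option("es.update.script.params", "transactions:transactions") // script param <- DF column
  .mode("append")
  .save("transactions/_doc")                      // index/type from the update URL above

Is a scripted upsert through the connector the right way to append to a nested array, or is there a better pattern for this?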
