Explode a document into multiple documents by delimited text field

I have the following index.

colors,value
"blue,red", 10
"red", 20
"green", 5
"blue,red", 15
"blue,green", 5

I want to aggregate value by color like so.

blue = 10, 15, 5
red = 10, 20, 15
green = 5, 5

The problem is that individual colors appear as comma-delimited elements of a single text field.

If I were working in pandas, I would do an explode operation, which would give me

blue, 10
red, 10
red, 20
green, 5
blue, 15
red, 15
blue, 5
green, 5

and I would do my aggregation on that.
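For context, the pandas version of what I mean (with the sample data above made up inline) is:

```python
import pandas as pd

# Sample data matching the index in the question.
df = pd.DataFrame({
    "colors": ["blue,red", "red", "green", "blue,red", "blue,green"],
    "value": [10, 20, 5, 15, 5],
})

# Split the delimited string into a list, then explode into one row per color.
exploded = df.assign(colors=df["colors"].str.split(",")).explode("colors")

# Aggregate value by color.
agg = exploded.groupby("colors")["value"].apply(list)
print(agg)
# blue  -> [10, 15, 5]
# red   -> [10, 20, 15]
# green -> [5, 5]
```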

I don't see an equivalent operation in Elasticsearch.

I don't want to modify the original index. Any explode and aggregate operations should occur at runtime.

What is the best way to do this?

I guess the splitting-on-delimiter part is easy enough: just use a split processor.
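For the delimiter part alone, a split processor would look something like this (note this only turns the field into an array within the same document, it does not create new documents):

```
{
  "split": {
    "field": "colors",
    "separator": ","
  }
}
```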

But the main question still applies: can I explode on an array field?

Is this the answer?

No, this applies to Logstash.

Elasticsearch uses ingest pipelines; there is no equivalent to the Logstash split filter.

You cannot split an array into multiple documents using ingest pipelines. If you want to do that, you need to parse your data in Logstash first and then send the resulting documents to Elasticsearch.
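As a sketch, assuming the field arrives as a comma-delimited string named colors, the Logstash side might look like:

```
filter {
  # Turn "blue,red" into the array ["blue", "red"].
  mutate {
    split => { "colors" => "," }
  }
  # Emit one event per array element, cloning the rest of the event.
  split {
    field => "colors"
  }
}
```

Each original event then becomes one event per color, and you can aggregate on the resulting documents in Elasticsearch.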