How to capture regex groups with /g?

Hi,

I have the following excerpt of a log line:

... [NO SORT], #0...#4000 (PageSize: 4001), , Fetched: 13, 687 ms0...END (PageSize: INFINITE), , Fetched: 147, 15 ms0...END (PageSize: INFINITE), , Fetched: 147, 16 ms0...END (PageSize: INFINITE), , Fetched: 147, 16 ms0...END (PageSize: INFINITE), , Fetched: 147, 15 ms0...END (PageSize: INFINITE), , Fetched: 147, 31 ms0...END (PageSize: INFINITE), , Fetched: 147, 47 ms0...END (PageSize: INFINITE), , Fetched: 147, 16 ms0...END (PageSize: INFINITE), , Fetched: 147, 15 ms0...END (PageSize: INFINITE), , Fetched: 147, 0 ms0...END (PageSize: INFINITE), , Fetched: 147, 32 ms0...END (PageSize: INFINITE), , Fetched: 147, 46 ms0...END (PageSize: INFINITE), , Fetched: 147, 16 ms0...END (PageSize: INFINITE), , Fetched: 147, 16 ms

I need to extract all of the ms values, aggregate them, and save the result as a field.
With a regex I can get the values as follows:

This gives me 14 regex groups back.

But how do I do this in Logstash? And how do I aggregate the values (value1 + value2 + value3 + ...)?

Thanks, Andreas

OK @asp,

Would you share the ruby filter here when you have it working?

I am currently working around it with the ruby filter.

I have written a ruby script which does the regex search and aggregates the data, but it's not finished yet.
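
In case it helps, this is roughly the direction I'm going (a rough, untested sketch; it assumes the log text is in a field called 'message', that every duration appears as '<number> ms', and it writes the individual values and their sum into the hypothetical fields 'ms_values' and 'total_ms'):

filter {
    ruby {
      code => "
        # hypothetical field names: 'message' as input, 'ms_values'/'total_ms' as output
        durations = event['message'].scan(/(\d+) ms/).flatten.map(&:to_i)
        event['ms_values'] = durations
        event['total_ms'] = durations.inject(0) { |sum, v| sum + v }
      "
    }
}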

I had a similar need and had to use the ruby filter too.
This example captures each block of alpha characters from the field 'input' and adds them to a new field 'groups'.

filter {
    ruby {
      code => "event['groups'] = event['input'].downcase.scan(/[[:alpha:]]+/)"
    }
}

input = "Module1-Customer-Region_2-FuncX-Label42"
groups = [ "module", "customer", "region", "funcx", "label" ]
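
Note that this uses the old event['field'] hash syntax. If you are on Logstash 5.x or newer, the ruby filter requires the event.get / event.set API instead, so the same example would look roughly like this:

filter {
    ruby {
      code => "event.set('groups', event.get('input').downcase.scan(/[[:alpha:]]+/))"
    }
}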