Aggregate logs without a special pattern or "flag" marking the last event of a group

I managed to find something, inspired by this use case: Filter Plugin: Elasticsearch

It uses the ruby plugin and could probably be optimised. I've changed my strategy since my first post. Note that this relies on events arriving in order, so the pipeline should run with a single worker (pipeline.workers: 1), otherwise the shared state gets mixed up across threads.

filter {
  ruby {
    init => "
      @@map = {}
      @@map['list_of_values_for_this_group'] = []
      @@map['group_of_previous_event'] = 'start'
    "
    code => "
      @@map['current_group'] = event.get('group_id')
      # If it's not the first event and we just changed group, reset the list
      if @@map['group_of_previous_event'] != 'start' && @@map['group_of_previous_event'] != @@map['current_group']
        @@map['list_of_values_for_this_group'] = []
      end
      # Put the new value in the list
      @@map['list_of_values_for_this_group'].push(event.get('value'))
      # Publish the list on every event so the last one of the group is not lost
      event.set('values', @@map['list_of_values_for_this_group'])
      @@map['group_of_previous_event'] = event.get('group_id')
    "
  }
}
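To see what the filter does, here is the same grouping logic sketched in plain Ruby, outside Logstash. The sample events and their values are made up for illustration; each event is just a hash, and events are assumed to arrive ordered by group:

```ruby
# Standalone sketch of the filter's grouping logic.
map = {
  'list_of_values_for_this_group' => [],
  'group_of_previous_event'       => 'start'
}

# Hypothetical input, already sorted by group_id
events = [
  { 'group_id' => 'g1', 'value' => 1 },
  { 'group_id' => 'g1', 'value' => 2 },
  { 'group_id' => 'g2', 'value' => 3 }
]

results = events.map do |event|
  current = event['group_id']
  # If it's not the first event and the group just changed, reset the list
  if map['group_of_previous_event'] != 'start' && map['group_of_previous_event'] != current
    map['list_of_values_for_this_group'] = []
  end
  map['list_of_values_for_this_group'].push(event['value'])
  map['group_of_previous_event'] = current
  # Publish a snapshot of the list on every event (dup so later pushes
  # don't mutate earlier snapshots in this simulation)
  event.merge('values' => map['list_of_values_for_this_group'].dup)
end

results.each { |e| p e }
```

The list grows while the group stays the same and is reset when the group changes, so the last event of each group carries the complete list.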

output {
  elasticsearch {
    action => "update"
    doc_as_upsert => true
    index => "my_index"
    document_id => "%{group_id}" # each new event of a group updates the previous one
  }
}
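Since every event upserts the document whose _id is the group_id, the last event of a group leaves the complete list in the index. A rough in-memory simulation of that update-by-id behaviour (the "index" is just a hash here, and the events are made up):

```ruby
# Simulation of elasticsearch "update" with doc_as_upsert keyed by group_id:
# every event overwrites the fields of the document with the same _id,
# so the final document for a group carries the full 'values' list.
index = {}

events = [
  { 'group_id' => 'g1', 'values' => [1] },
  { 'group_id' => 'g1', 'values' => [1, 2] },
  { 'group_id' => 'g2', 'values' => [3] }
]

events.each do |event|
  doc_id = event['group_id']  # document_id => "%{group_id}"
  index[doc_id] ||= {}        # upsert: create the doc if it's missing
  index[doc_id].merge!(event) # update: newer field values win
end

p index['g1'] # the final g1 document holds the whole list
```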

Hope it helps someone.