JSON array parsing in Logstash using the ruby filter

The log is parsed correctly by Logstash, but the array is indexed as a nested object in Elasticsearch. Since nested objects are not well supported in Kibana, I want to flatten the array into top-level fields using the ruby filter. Here is my config:

input {
  file {
    path => "/home/fire.log"
    sincedb_path => "/dev/null"
    start_position => "beginning"
    codec => json
  }
}
filter {
  ruby {
    code => "
      message_array = event.get('[alert][explanation][cnc-services][cnc-service]')
      # Skip events that do not contain the array
      if message_array
        message_array.each_with_index do |item, index|
          event.set('cnc-service' + index.to_s, item)
        end
      end
    "
  }
  mutate {
    remove_field => ["[alert][explanation][os-changes]", "headers", "[alert][interface]", "alert", "appliance", "path", "product"]
  }
}
output {
  stdout { codec => rubydebug }
}
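
For context, here is a simplified sketch of the event structure I am dealing with. The field path matches my config; the keys and values inside each cnc-service entry are just placeholders, not my real data:

{
  "alert": {
    "explanation": {
      "cnc-services": {
        "cnc-service": [
          { "port": 80, "protocol": "tcp" },
          { "port": 443, "protocol": "tcp" }
        ]
      }
    }
  }
}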

The ruby block above gives me distinct fields (cnc-service0, cnc-service1, and so on), each holding a valid JSON object. I could parse each of them with a json filter, but the number of cnc-service objects varies between events and I do not want to hard-code it. Can I instead merge them into a single hash object?
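
Something like this is what I have in mind, replacing the ruby block above (a rough, untested sketch; 'cnc-services-merged' is just a field name I made up):

  ruby {
    code => "
      services = event.get('[alert][explanation][cnc-services][cnc-service]')
      if services
        merged = {}
        # Collect every array element into one hash, keyed by its index
        services.each_with_index do |item, index|
          merged[index.to_s] = item
        end
        event.set('cnc-services-merged', merged)
      end
    "
  }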

I am not familiar with Ruby; in Python I could easily do this as follows:

from itertools import chain
from collections import defaultdict

dict3 = defaultdict(list)  # so appending works for keys not seen before
for k, v in chain(dict1.items(), dict2.items()):
    dict3[k].append(v)

Can I do something similar using the ruby filter?
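
From what I have read so far, the Ruby equivalent of that pattern might look roughly like this (untested, since I am new to Ruby; hash1, hash2, and merged are placeholder names):

# A hash whose default value is a fresh array, like Python's defaultdict(list)
merged = Hash.new { |h, k| h[k] = [] }
(hash1.to_a + hash2.to_a).each { |k, v| merged[k] << v }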