Array of objects - LogStash Conf

I'm currently working on fetching my data from MySQL and pushing it to Elasticsearch through Logstash, but I'm having an issue with the Logstash configuration.

I basically have Products, and each of those products can have a BOGO (Buy One Get One) deal. They are stored separately in their own tables and linked by a ONE -> MANY relation.

To make things easier for me, I have a SQL view that joins everything, in order to easily create a JSON object.

Since products and BOGO deals have a (ONE -> MANY) relation, a SELECT query using that view would return the product multiple times, each with a different BOGO deal.

What I want to do is collect all of those BOGO deals and place them in an array of objects.
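In plain Ruby terms (a rough sketch with made-up column names, independent of Logstash), the grouping I'm after looks like this: collapse the rows that repeat the product once per BOGO deal into a single product carrying an array of deal objects.

```ruby
# Rough sketch (hypothetical column names): group joined rows by product
# and accumulate the per-row deal columns into an array of objects.
rows = [
  { "product_id" => "p1", "bogo_deal_id" => "d1" },
  { "product_id" => "p1", "bogo_deal_id" => "d2" }
]

products = {}
rows.each do |row|
  # Create the product entry on first sight, reuse it afterwards.
  product = products[row["product_id"]] ||= { "id" => row["product_id"], "bogoDeals" => [] }
  product["bogoDeals"] << { "id" => row["bogo_deal_id"] }
end

puts products.values.inspect
```

This prints one product whose bogoDeals array holds both deals, which is essentially what the aggregate filter should produce per product.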

Note that I'm trying to end up with a JSON object that looks like this:

{
  "id": "someId",
  "createdAt": "Date",
  "modifiedAt": "Date",

  "manufacturerCompany": {
    "id": "someId",
    "name": "someName",
    "createdAt": "Date",
    "modifiedAt": "Date"
  },

  "bogoDeals": [
    {JSON Object}, {JSON Object}, {JSON Object}
  ]
}

My current configuration looks similar to this; note that it does not work.

logstash.conf - Filter section only

filter {
  mutate {
    # Product Manufacturing Company
    rename => { "manufacturer_company_id" => "[manufacturerCompany][id]" }
    rename => { "manufacturer_company_name" => "[manufacturerCompany][name]" }
    rename => { "manufacturer_company_created_at" => "[manufacturerCompany][createdAt]" }
    rename => { "manufacturer_company_modified_at" => "[manufacturerCompany][modifiedAt]" }

    rename => { "product_id" => "[id]" }
    rename => { "product_created_at" => "[createdAt]" }
    rename => { "product_modified_at" => "[modifiedAt]" }
  }

  aggregate {
    task_id => "%{product_id}"
    code => "
      map['manufacturerCompany'] ||= []
      map['manufacturerCompany'] << {
        'id' => event.get('[manufacturerCompany][id]'),
        'name' => event.get('[manufacturerCompany][name]'),
        'createdAt' => event.get('[manufacturerCompany][createdAt]'),
        'modifiedAt' => event.get('[manufacturerCompany][modifiedAt]')
      }

      map['bogoDeals'] ||= []
      map['bogoDeals'] << {
        'id' => event.get('bogo_deal_id'),
        'createdAt' => event.get('bogo_deal_created_at'),
        'modifiedAt' => event.get('bogo_deal_modified_at')
      }
    "
    push_previous_map_as_event => true
    push_map_as_event_on_timeout => true
    inactivity_timeout => 5
    timeout => 10
  }

  mutate {
    remove_field => ["@version", "unix_ts_in_secs", "@timestamp", "tags"]
  }

  ruby {
    path => "logstash-ruby.rb"
  }
}

logstash-ruby.rb

def filter(event)
  @bogoDeals = event.get('bogoDeals')
  puts @bogoDeals.inspect
  if @bogoDeals
    event.set('bogoDeals', @bogoDeals.uniq { |p| p["id"] })
  end
  return [event]
end
For some reason all the bogoDeals are null in Ruby. Is my way of doing this correct? If so, what am I doing wrong, and why are the bogoDeals null in Ruby? Note that I have absolutely no experience in Ruby. Much appreciated.

You have already renamed product_id to id before the aggregate filter runs, so task_id => "%{product_id}" will not do what you want. Also, you are creating map['manufacturerCompany'] as an array, which is not what it looks like in the desired output.
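One possible fix (an untested sketch, using the field names from your question) is to build the map directly from the original column names inside the aggregate, and only there, so no rename has to happen first, and to make manufacturerCompany a single hash rather than an array:

```
filter {
  aggregate {
    # Keyed on the original column name, before any rename touches it.
    task_id => "%{product_id}"
    code => "
      map['id'] ||= event.get('product_id')
      map['createdAt'] ||= event.get('product_created_at')
      map['modifiedAt'] ||= event.get('product_modified_at')

      # A single nested object, not an array.
      map['manufacturerCompany'] ||= {
        'id' => event.get('manufacturer_company_id'),
        'name' => event.get('manufacturer_company_name'),
        'createdAt' => event.get('manufacturer_company_created_at'),
        'modifiedAt' => event.get('manufacturer_company_modified_at')
      }

      map['bogoDeals'] ||= []
      map['bogoDeals'] << {
        'id' => event.get('bogo_deal_id'),
        'createdAt' => event.get('bogo_deal_created_at'),
        'modifiedAt' => event.get('bogo_deal_modified_at')
      }

      # Drop the per-row events; the pushed map becomes the product event.
      event.cancel
    "
    push_previous_map_as_event => true
    timeout => 10
  }
}
```

Also note that the aggregate filter requires the pipeline to run with a single worker (pipeline.workers set to 1), otherwise rows for the same product may be processed out of order across threads.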

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.