Add array items as prefixes for another array

Hi,
I am trying to push articles into the corresponding Kafka topics.

article: {
  "topics": [
    "Barclays",
    "Companies",
    "Government",
    "Warrant"
  ],
  "categories": [
    "economy",
    "business",
    "finance"
  ]
}

The corresponding Kafka topics are of the form Category_Topic, for example:

["economy_Barclays","business_Barclays","finance_Barclays","economy_Government","business_Government","finance_Government"]

Is this possible without using a Ruby filter?

I used a Ruby filter, and it is working correctly:

input {
  elasticsearch {
    hosts => "localhost"
    index => "articles"
    add_field => { "kafka_topics" => '' }
  }
}

filter {
  ruby {
    code => "
      # build the cross product of categories and topics,
      # e.g. 'economy' + 'Barclays' => 'economy_Barclays'
      kafka_topics = []
      categories = event.get('categories')
      topics = event.get('topics')
      categories.each { |category| topics.each { |topic| kafka_topics.push(category + '_' + topic) } }
      event.set('kafka_topics', kafka_topics)
    "
  }
}
output {
  kafka {
    codec => json
    topic_id => "foo"
  }
}
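
To sanity-check the generated field, a stdout output with the rubydebug codec can be added alongside the kafka output (a sketch):

output {
  # prints each event, including the generated kafka_topics array
  stdout { codec => rubydebug }
}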

My question here: is there a way to push the event to multiple topics in Kafka,
exactly like the Kafka input plugin allows?

Well, the only possible way I can see is to pass a dynamic value to topic_id.
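
For example, topic_id accepts Logstash sprintf field references, so per-event routing is possible when each event carries a single-valued topic field (a minimal sketch; the kafka_topic field name is hypothetical):

output {
  kafka {
    codec => json
    # %{...} is resolved per event; kafka_topic is a hypothetical single-valued field
    topic_id => "%{[kafka_topic]}"
  }
}

The catch is that this routes each event to exactly one topic, so an array field does not map onto it directly.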

Yes, I am aware of dynamic values. The problem is that I don't have a predefined list of outputs, or even the number of outputs.
Is there a way to execute Ruby code in the output?

I don't think that is possible. An alternative is to restrict the topics to a predefined list and enforce it with the prune filter.
Read through the prune filter docs to see if that can work.
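
A rough sketch of that idea, assuming the allowed list is known up front (note that prune matches field names against patterns, so this only works if each candidate topic is its own field; the patterns below are illustrative):

filter {
  prune {
    # keep only fields whose names match an allowed category prefix (hypothetical patterns)
    whitelist_names => [ "^economy_", "^business_", "^finance_" ]
  }
}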

Does Logstash support partial config files?
Couldn't I run a Ruby filter, write the outputs into a config file, and then load that?
Does Logstash load the config before running?
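
For reference: Logstash loads the entire config at startup, but recent versions can watch the config for changes and reload pipelines on the fly. A sketch of the relevant logstash.yml settings (values illustrative):

config.reload.automatic: true
config.reload.interval: 3s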

I forked logstash-output-kafka, and it now supports multi-topic output:

  kafka {
    topic_ids => ["foo","test"]
  }
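
For anyone who cannot run a fork, a rough stock-Logstash alternative is to split the kafka_topics array into one event per topic and route with a sprintf topic_id (a sketch; field names follow the config earlier in the thread):

filter {
  # emits one copy of the event per element of kafka_topics
  split {
    field => "kafka_topics"
  }
}

output {
  kafka {
    codec => json
    # after the split, kafka_topics holds a single topic string per event
    topic_id => "%{[kafka_topics]}"
  }
}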
