Creating a global array in Logstash and sharing it across inputs

Hi,

I would like to create a global array/map from the jdbc input plugin and share that array with the filebeat events.

Example (pseudocode):

input {
  jdbc {
    # contains business logic details
  }
  beats {
    # contains input data
  }
}

filter {
  if [message] is from jdbc {
    # create a global array/map and hold the business logic rules
  }
  if [message] is from filebeat {
    # use the global array of business logic rules and apply them to each filebeat event
  }
}

output {
}
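
Concretely, I think I could tell the two inputs apart by tagging them and branching on the tags, something like this (the port, connection string, and query below are just placeholders):

input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/rulesdb"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_user => "logstash"
    statement => "SELECT * FROM business_rules"
    tags => ["rules"]
  }
  beats {
    port => 5044
    tags => ["logs"]
  }
}

filter {
  if "rules" in [tags] {
    # build the global array/map here (this is the part I do not know how to do)
  }
  if "logs" in [tags] {
    # look the rules up here and enrich the event
  }
}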

Please explain your end goal instead. In what way do you want to apply your global business logic rules? What is their function?

Hey @magnusbaeck, thanks for your reply.

I have set up Filebeat to pull logs, and I want to apply some business logic that resides in a DB, then push the final result to Elasticsearch. How can I share the business rules/logic in the DB with the Filebeat events in Logstash?

Options i explored:

  1. Pushing the filebeat event (message) to the http output plugin. The receiving HTTP application can fetch the business logic from the DB, apply it, and finally push the result to Elasticsearch (see the sketch after this list).
  2. The exec output plugin (almost the same as option 1).
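
For option 1, the output side would look roughly like this (the endpoint URL is a placeholder for the hypothetical rule-applying service):

output {
  http {
    # placeholder endpoint that applies the DB rules and indexes into Elasticsearch
    url => "http://localhost:8080/apply-rules"
    http_method => "post"
    format => "json"
  }
}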

Let me know if these would cause any performance issues, or whether there is a better approach.

Have you looked at the jdbc_streaming filter?
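
Something like this, sketched from memory (the driver, connection string, query, and field names are placeholders for your own):

filter {
  jdbc_streaming {
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/rulesdb"
    jdbc_user => "logstash"
    jdbc_password => "secret"
    statement => "SELECT rule FROM business_rules WHERE app = :app"
    parameters => { "app" => "application" }
    target => "business_rules"
  }
}

It runs the query for each event (with the :app parameter taken from the event's application field), caches results by default so you do not hit the DB for every event, and stores the returned rows as an array of hashes in the business_rules field, which later filters can act on.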

Hey @magnusbaeck, I tried jdbc_streaming and was able to achieve my requirement. Thanks!
