Modular pipeline construction with the same input plugin (Kafka) but different topic names

Hi,

I am trying to construct multiple modular pipelines in Logstash. All pipelines use the same input plugin (Kafka), but each one has different attribute values, such as the topic name and broker address, in its input config file.

The following example illustrates the problem.

This is my pipelines.yml:

- pipeline.id: pipeline_1
  path.config: "/etc/logstash/conf.d/{01_in,01_filter1,02_filter2,01_out}.conf"

- pipeline.id: pipeline_2
  path.config: "/etc/logstash/conf.d/{02_in,03_filter3,02_filter2,02_out}.conf"

These are my input configuration files.

01_in.conf looks like the following:

input {
    kafka {
        bootstrap_servers => "10.89.16.15:9092"
        topics => ["topic1"]
        enable_auto_commit => "false"
        auto_offset_reset => "earliest"
        group_id => "group-1"
    }
}

02_in.conf looks like the following:

input {
    kafka {
        bootstrap_servers => "10.89.16.12:9092"
        topics => ["topic2"]
        enable_auto_commit => "false"
        auto_offset_reset => "earliest"
        group_id => "group-1"
    }
}

Notice that bootstrap_servers and topics differ between the two files.

My question: if I have 100 topics, I will have to create 100 input config files that differ in only two fields. Is there a way to pass the topic name and server address to a single generic input config file, from pipelines.yml or somewhere similar?
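One idea (a sketch, not tested in your setup): Logstash supports environment-variable substitution in config files with the `${VAR}` and `${VAR:default}` syntax, so a single generic input file could look like this. The variable names `KAFKA_BROKER` and `KAFKA_TOPIC` here are my own invention, not anything Logstash predefines:

```
input {
    kafka {
        # Both values are read from the environment when Logstash starts;
        # the part after ":" is a fallback default.
        bootstrap_servers => "${KAFKA_BROKER:10.89.16.15:9092}"
        topics => ["${KAFKA_TOPIC:topic1}"]
        enable_auto_commit => "false"
        auto_offset_reset => "earliest"
        group_id => "group-1"
    }
}
```

The caveat is that environment variables are resolved once per Logstash process, so all pipelines in the same instance would see the same values; this approach only gives per-pipeline values if each pipeline runs in its own process or container. Within one instance, an alternative worth considering is a single input that subscribes to many topics (via the `topics` array or `topics_pattern`) with `decorate_events` enabled, and then branching in filters on the topic name that the plugin records in `[@metadata][kafka][topic]`.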

Thanks in advance.
