Logstash - rabbitmq config to get multiple queues' data into multiple Elasticsearch indices

RabbitMQ has more than three different queues. I want to make a Logstash configuration that reads data from each RabbitMQ queue and indexes it into Elasticsearch. To do that, I created a Logstash conf file for EVERY queue SEPARATELY, with the configuration below.

input {
  rabbitmq {
    id => "rabbitmyq_id0"
    # connect to rabbit
    host => "rabbitmq_hostname"
    port => 5672
    vhost => "/"
    user => "user"
    password => "pass"
    queue => "WebApiLog"
    ack => false
    durable => true
    exchange => "logstash_WebApiLog"
    exchange_type => "fanout"
    key => "WebApiLog"
    arguments => {
      "x-queue-type" => "quorum"
    }
  }
}

output {
  elasticsearch {
    hosts => [ "https://elastichostname:9200" ]
    index => "webapi-logs-%{+YYYY.MM.dd}"
  }
}

When I run Logstash, all of the RabbitMQ exchanges are created, and Logstash starts collecting data from RabbitMQ and indexing it into Elasticsearch.

The problem:
Although I created a separate config for every queue, when I check the created indices in Elasticsearch, I see that every index contains data from ALL queues, not just the queue it was created for. I want every index to contain only the data belonging to its own queue.

Can someone help me configure Logstash to do that?


How did you create that?

What does your pipelines.yml look like?

I didn't make any modifications to the pipelines.yml file. Should I?
My config files are in the conf.d directory, and the content of the yml file is below:

- pipeline.id: main
  path.config: "/etc/logstash/conf.d/*.conf"

If path.config points to multiple files then they are concatenated into a single configuration. Events from all of the inputs are sent to all of the outputs. You need to use pipelines.yml to run each configuration in a separate pipeline.
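As a minimal sketch, pipelines.yml could define one pipeline per queue config file. The pipeline ids and file names here are illustrative; each entry should point at exactly one of the per-queue .conf files so that each input/output pair runs in isolation:

```yaml
# Each pipeline gets its own id and its own config file, so events
# from one rabbitmq input only reach that file's elasticsearch output.
- pipeline.id: webapilog
  path.config: "/etc/logstash/conf.d/webapilog.conf"
- pipeline.id: otherqueue
  path.config: "/etc/logstash/conf.d/otherqueue.conf"
```

After editing pipelines.yml, restart Logstash (and remove or adjust the old `main` entry with the `*.conf` glob, since it would otherwise still merge all files into one pipeline).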


If you want the pipelines to run independently of each other, you need to edit pipelines.yml to do that.

The way it is configured now it will merge all the .conf files inside /etc/logstash/conf.d/ and run as one big pipeline, all the inputs will be sent to all the outputs, which is the situation you described.

Check the linked documentation in the previous answer on how to configure multiple pipelines.


Thanks @leandrojmp, @Badger. It works as I expected after editing the yml file :pray:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.