Multiple pipelines - distributor pattern

Okay, I have solved the pipeline virtual address problem. The name cannot contain the - character, so renaming it to "legacyva" resolved it.

There is some weirdness with YAML that I don't quite understand. As you described, putting quotes around the [type] variable causes the pipeline to fail to load. Also, setting the type in filebeat to anything other than log prevents filebeat from starting; it seems type refers to the input type and only accepts certain values, so in any case I cannot name it what I want, with or without quotes.

So I really need "tags" or some other construct to perform the loop properly, one that takes quotes and contains no dashes. I am concerned that using "tags" affects the logic, since I am only looking for a single flag match, and == (I believe) means matches exactly. I'll keep testing to see if I can work out this last piece.

edit> Did some searching on "logstash tags conditionals" and found that the syntax is different. Not sure where this is documented, but I copied a gentleman's example and changed from if [tags] == "tagname" to if "tagname" in [tags]. That makes more sense now: [tags] is an array, so == would compare against the whole array, while in checks whether the array contains the tag. It needed to be more of a "contains" than an "is equal to", which is exactly what was bothering me about that ==. Anyways, glad I learned something, and now my conditional statement is catching my file based on a tag! HOORAY!
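For completeness, here is a minimal sketch of the filebeat side that produces the tag being matched. Paths, the tag name, and the logstash address are example placeholders; the point is that type stays as a valid input type (log) and the custom label goes in tags:

```yaml
# filebeat.yml (sketch; paths, tag names, and addresses are examples)
filebeat.inputs:
  - type: log                     # "type" is the input type and must be a valid one
    paths:
      - /var/log/primary/*.log
    tags: ["primary"]             # this is what `if "primary" in [tags]` matches

output.logstash:
  hosts: ["LOGSTASH ADDRESS:5000"]
```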

Big thanks to @Badger, I kept trying to drive off the course and you definitely got me back on path more than once!

For anyone else trying to do this specific thing in the future (that is, ship many different files with filebeat to a single logstash listener, then separate the pipelines by some pre-defined value so that each pipeline can have its own filters and its own index choice), here is the base template you need for your pipeline.conf files:

master-pipeline.conf
input {
  beats {
    port => "5000"
  }
}
output {
### conditional statement to separate logs by tag ###
  if "primary" in [tags] {
    pipeline { send_to => primaryvirtualaddress }
  } else {
    pipeline { send_to => fallbackvirtualaddress }
  }
}

each subsequent pipeline:

primary-pipeline.conf
input {
  pipeline {
    address => "primaryvirtualaddress"
  }
}
output {
  elasticsearch {
    hosts => ["ES CLUSTER ADDRESS"]
    index => "primary_index"
  }
}

You may be able to remove the quotes from the address, but I'm quite sick of re-testing at this point and I know this template works 🙂
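One thing the .conf files above don't show: for pipeline-to-pipeline communication, each pipeline also has to be declared in logstash's pipelines.yml so all of them get loaded. A sketch, where the ids and paths are just examples for the template above:

```yaml
# pipelines.yml (sketch; pipeline ids and config paths are examples)
- pipeline.id: master
  path.config: "/etc/logstash/conf.d/master-pipeline.conf"
- pipeline.id: primary
  path.config: "/etc/logstash/conf.d/primary-pipeline.conf"
- pipeline.id: fallback
  path.config: "/etc/logstash/conf.d/fallback-pipeline.conf"
```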
