Cluster and scheduled pipelines

I'm setting up a Logstash cluster to run multiple pipelines. Besides Beats, it will run a number of scheduled JDBC pipelines, but I do not want the JDBC pipelines running in parallel on several nodes, which would result in duplicate records. One solution is to deploy the scheduled pipelines to a single Logstash node only, though that complicates our deployment pipeline a bit. Is there any way to scope a pipeline to run on a single node, by host name, tag, or some other metadata?

There's no great way of doing this. You should be able to wrap the pipeline's configuration in conditionals like

if "${HOSTNAME}" == "some-designated-host" { ... }

(where "some-designated-host" is whichever node you pick), assuming HOSTNAME is an environment variable that you define somewhere. So all your Logstash instances would technically run the same pipelines, but on all hosts except one they wouldn't actually do anything.
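As a sketch, the output section of the JDBC pipeline's config might look something like this; the hostname value and the elasticsearch settings are placeholders, and HOSTNAME has to be exported in the environment Logstash is started from:

```
# jdbc-pipeline.conf -- deployed unchanged to every node
output {
  # Only the designated node actually ships the events; on every
  # other node this conditional is false and nothing is output.
  if "${HOSTNAME}" == "logstash-node-1" {
    elasticsearch {
      hosts => ["http://localhost:9200"]  # placeholder
    }
  }
}
```

Note that Logstash conditionals only work in the filter and output sections, not in inputs, so the scheduled JDBC query itself would still run on every node; the conditional just stops the other nodes from emitting duplicate records.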

Hi Magnus,

Thanks for the feedback. Do I have to wrap the configuration in both the input and output sections, or can I use conditions in e.g. pipelines.yml?

Conditions aren't supported in pipelines.yml.
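pipelines.yml only holds per-pipeline settings, i.e. an id, a path to the config files, and tuning options such as worker counts, so there's nowhere to put a condition. It looks something like this (the ids and paths below are placeholders):

```
# pipelines.yml -- per-pipeline settings only, no conditionals
- pipeline.id: beats
  path.config: "/etc/logstash/conf.d/beats.conf"
- pipeline.id: scheduled-jdbc
  path.config: "/etc/logstash/conf.d/jdbc.conf"
  pipeline.workers: 1
```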

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.