Pipeline-to-pipeline communication - will passing variables work?

Hi!
We're designing our Logstash configuration and trying to set up a flow where a pipeline receives events, then, based on the content of each event, assigns codec and destination file values (by adding custom fields), and then sends the event to a final output pipeline for output processing.

Will passing variables from one pipeline to another via custom fields work in this case?

Example:

Intake pipeline (filter section) - here we add custom fields based on the input content:

filter {
  if [fields][application] and [fields][environment] and [fields][context] {
    mutate {
      add_field => {
        "[output_logger_filesystem]" => "true"
        "[output_file_codec]" => "line { format => '%{[beat][hostname]} %{message}'}"
        "[output_file_path]" => "/mnt/logstash_logs/%{[fields][application]}-%{[fields][region]}-%{[fields][environment]}/%{[fields][context]}.%{+YYYYMMddHH}.log"
      }
    }
  }
}

Intake pipeline (output section) - here we send the event to the filesystem output pipeline if the field 'output_logger_filesystem' equals "true":

output {
  # WRITE TO FILESYSTEM
  if [output_logger_filesystem] == "true" {
    pipeline { send_to => [loggerFilesystemTest] }
  }
}
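For pipeline-to-pipeline communication, both pipelines also have to be declared in pipelines.yml so Logstash runs them in the same instance. A minimal sketch — the pipeline ids and config paths here are placeholders, not taken from our actual setup:

```yaml
# pipelines.yml - both pipelines must run in the same Logstash instance
- pipeline.id: intake
  path.config: "/etc/logstash/conf.d/intake.conf"
- pipeline.id: logger-filesystem-test
  path.config: "/etc/logstash/conf.d/logger_filesystem.conf"
```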

Output pipeline - and finally we write the event to the filesystem, based on the codec and path variables passed from the previous pipeline:

FILESYSTEM OUTPUT

input { pipeline { address => loggerFilesystemTest } }

output {
  if "metric" in [tags] {
    stdout {
      codec => line { format => "[logger-filesystem-test] Index rate: %{[events][rate_1m]}" }
    }
  }

  if [output_file_codec] {
    file {
      path => "%{[output_file_path]}"
      codec => "%{[output_file_codec]}"
    }
  } else {
    file {
      path => "%{[output_file_path]}"
    }
  }
}

So, essentially, we're passing output_file_path and output_file_codec as variables to the output pipeline, and the output pipeline decides which codec to use and which file path to write to based on these variables.

Will this setup work?

Thank you!

Yes, partially. Custom fields added in the intake pipeline travel with the event to the output pipeline, and the file output's path setting supports sprintf references like %{[output_file_path]}, so that part will work. The codec setting, however, is instantiated when the pipeline starts and is not sprintf-capable, so codec => "%{[output_file_codec]}" will not be interpolated per event.
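Since a codec cannot be selected via a sprintf reference, one workaround is to pass a simple marker value in the field and choose between fixed codecs with a conditional in the output pipeline. A sketch of that idea — the marker value "hostline" is an assumption, not part of the original config:

```
output {
  # intake pipeline would set [output_file_codec] to a marker like "hostline"
  if [output_file_codec] == "hostline" {
    file {
      path => "%{[output_file_path]}"
      # the codec is fixed here, but its format string is still sprintf-capable
      codec => line { format => "%{[beat][hostname]} %{message}" }
    }
  } else {
    file {
      path => "%{[output_file_path]}"
    }
  }
}
```

This keeps the "intake decides, output executes" design while staying within what codec settings support.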