I would like to refactor my pipelines using one of the pipeline-to-pipeline architecture patterns (the forked path pattern). The upstream pipeline's input is a one-time query to an Elasticsearch cluster via the elasticsearch input plugin, so the number of events sent to the downstream pipelines is finite.
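To illustrate, here is a minimal sketch of the kind of setup I mean (pipeline IDs, file paths, hosts, and the query are placeholders, not my actual config):

```yaml
# pipelines.yml
- pipeline.id: upstream
  path.config: "/etc/logstash/upstream.conf"
- pipeline.id: downstream-a
  path.config: "/etc/logstash/downstream-a.conf"
- pipeline.id: downstream-b
  path.config: "/etc/logstash/downstream-b.conf"
```

```text
# upstream.conf - a one-time Elasticsearch query feeding the forked path
input {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my-index"
    query => '{ "query": { "match_all": {} } }'
  }
}
output {
  # forked path pattern: send every event to both downstream pipelines
  pipeline { send_to => ["downstream-a", "downstream-b"] }
}

# downstream-a.conf (downstream-b.conf is analogous)
input {
  pipeline { address => "downstream-a" }
}
output {
  # ... processing and final outputs ...
}
```

Without the `pipeline` output/input plugins (i.e. a single monolithic pipeline), Logstash exits once the elasticsearch input is exhausted.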
However, once the downstream pipelines have consumed and processed all the events, Logstash does not shut down, as it does when I don't use pipeline-to-pipeline communication.
Is this the expected behaviour? Is there a way to shut down Logstash once all the downstream pipelines have processed every event coming from upstream?